---
license: llama2
---

Llama-2-Amharic

Uses Llama-2-7b as the base model (not the chat variant).

You can use the provided inference demo script to run inference. Note that it expects to be run in the context of the llama-recipes repository (https://github.com/facebookresearch/llama-recipes/tree/main/src/llama_recipes/inference).

Ensure that you replace the Llama-2 tokenizer with the Llama-2-Amharic tokenizer.

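If you are not running the llama-recipes script directly, a minimal loading sketch with Hugging Face `transformers` might look like the following. The paths are placeholders, not confirmed repository IDs; point them at your copies of the fine-tuned weights and the Amharic tokenizer.

```python
# Minimal loading sketch, assuming Hugging Face transformers.
# Both paths below are placeholders: substitute your own copies of the
# fine-tuned weights and the Llama-2-Amharic tokenizer.
import torch
from transformers import AutoModelForCausalLM, LlamaTokenizer

MODEL_PATH = "path/to/llama-2-amharic"                # placeholder
TOKENIZER_PATH = "path/to/llama-2-amharic-tokenizer"  # placeholder: NOT the stock Llama-2 tokenizer

tokenizer = LlamaTokenizer.from_pretrained(TOKENIZER_PATH)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_PATH,
    torch_dtype=torch.float16,  # half precision to fit the 7B model on a single GPU
    device_map="auto",
)
model.eval()
```
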
To avoid hallucinations, use a low top-k.

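For example, with `transformers` the sampling settings could be constrained as in the sketch below; the specific values are illustrative assumptions, not tuned recommendations from the model authors.

```python
# Sketch of conservative sampling settings; the values are illustrative assumptions.
from transformers import GenerationConfig

generation_config = GenerationConfig(
    do_sample=True,
    top_k=5,             # low top-k, as suggested above, to reduce hallucinations
    temperature=0.3,     # assumed: a conservative temperature
    max_new_tokens=256,
)
```

Pass this config to `model.generate(...)` when running inference, as shown further below.
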
Use the prompt format `"$system\nHuman: $prompt\nAssistant [Amharic] : "`.

Specify Amharic or English in the prompt to control the response language; the prompt itself can be in either language.

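Putting the pieces together, a hypothetical helper for building prompts in this format might look like the sketch below. It reuses `tokenizer`, `model`, and `generation_config` from the earlier sketches; `build_prompt` and the sample question are made up for illustration, and it assumes the bracketed tag after `Assistant` is what selects the response language.

```python
# Hypothetical prompt-building helper; reuses tokenizer, model, and
# generation_config from the sketches above.
def build_prompt(system: str, prompt: str, language: str = "Amharic") -> str:
    # Follows the format "$system\nHuman: $prompt\nAssistant [Amharic] : "
    return f"{system}\nHuman: {prompt}\nAssistant [{language}] : "

system = (
    "Below is an interaction between a human and an AI fluent in English and "
    "Amharic, providing reliable and informative answers."
)
question = "What is the capital of Ethiopia?"  # the prompt may be English or Amharic

inputs = tokenizer(build_prompt(system, question), return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, generation_config=generation_config)
# Strip the prompt tokens before decoding so only the model's answer remains.
answer = tokenizer.decode(
    output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True
)
print(answer)
```
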
Example:

"Below is an interaction between a human and an AI fluent in English and Amharic, providing reliable and informative answers. |
|
The AI is supposed to answer test questions from the human with short responses saying just the answer and nothing else. |
|
|
|
Human: α ααα αααα α¨αα½α αα½α αααͺα« αα΄αα ααα΅α°ααα
α¨α
α α₯αα α»αα’ α’α«αα΅ 3 α αααΎα½ ααα α αα£αΈαα’ |
|
|
|
Assistant [Amharic] : " |
|
|
|
Cite:

```
@misc{andersland2024amharic,
  title={Amharic LLaMA and LLaVA: Multimodal LLMs for Low Resource Languages},
  author={Michael Andersland},
  year={2024},
  eprint={2403.06354},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
```
|