Update README.md
README.md CHANGED
@@ -26,6 +26,7 @@ SambaLingo-Bulgarian-Chat is a human aligned chat model trained in Bulgarian and
 - **Language(s):** Bulgarian, English
 - **Finetuned from model:** [Llama 2](https://huggingface.co/meta-llama/Llama-2-7b-hf)
 - **Try this model:** [SambaLingo-chat-space](https://huggingface.co/spaces/sambanovasystems/SambaLingo-chat-space)
+- **Paper:** [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
 - **Blog Post**: [sambalingo-open-source-language-experts](https://sambanova.ai/blog/sambalingo-open-source-language-experts)

 ## Getting Started
@@ -84,6 +85,9 @@ The DPO phase was done on the [ultrafeedback](https://huggingface.co/datasets/Hu
 ## Tokenizer Details
 We extended the vocabulary of the base llama model from 32,000 tokens to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language.

+## Evaluation
+For evaluation results see our paper: [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
+
 ## Uses
 <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

@@ -126,12 +130,12 @@ We would like to give a special thanks to the following groups:

 ## Cite SambaLingo
 ```
-@
-
-
-
-
-
-
+@misc{csaki2024sambalingo,
+      title={SambaLingo: Teaching Large Language Models New Languages},
+      author={Zoltan Csaki and Bo Li and Jonathan Li and Qiantong Xu and Pian Pawakapan and Leon Zhang and Yun Du and Hengyu Zhao and Changran Hu and Urmish Thakker},
+      year={2024},
+      eprint={2404.05829},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
 }
 ```
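
As a quick sanity check on the Tokenizer Details context above, here is a minimal sketch, assuming the model id `sambanovasystems/SambaLingo-Bulgarian-Chat` for this card and access to the gated `meta-llama/Llama-2-7b-hf` base checkpoint, that loads both tokenizers with `transformers` and compares their vocabularies on a short Bulgarian sentence:

```python
# Sketch only: check the vocabulary extension described under "Tokenizer Details".
# Assumptions: the SambaLingo model id below matches this card, and you have been
# granted access to the gated meta-llama/Llama-2-7b-hf repository on Hugging Face.
from transformers import AutoTokenizer

base_tok = AutoTokenizer.from_pretrained("meta-llama/Llama-2-7b-hf")
chat_tok = AutoTokenizer.from_pretrained("sambanovasystems/SambaLingo-Bulgarian-Chat")

# Roughly 32,000 for the base vocabulary and about 57,000 after extension
# (small offsets from added special tokens are possible).
print(len(base_tok), len(chat_tok))

# The added Bulgarian tokens should split Bulgarian text into fewer pieces.
text = "Колко е часът в София?"  # "What time is it in Sofia?"
print(len(base_tok.tokenize(text)), len(chat_tok.tokenize(text)))
```

Fewer pieces per Bulgarian sentence is the intended payoff of the added tokens: the same text yields shorter sequences during both training and inference.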