zolicsaki committed
Commit ef1cf3f
1 Parent(s): eb0c27e

Update README.md
Files changed (1)
  1. README.md +11 -7
README.md CHANGED
@@ -129,6 +129,7 @@ SambaLingo-Thai-Chat is a human aligned chat model trained in Thai and English.
  - **Language(s):** Thai, English
  - **Finetuned from model:** [Llama-2-7b](https://huggingface.co/meta-llama/Llama-2-7b-hf)
  - **Try This Model:** [SambaLingo-chat-space](https://huggingface.co/spaces/sambanovasystems/SambaLingo-chat-space)
+ - **Paper:** [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
  - **Blog Post**: [sambalingo-open-source-language-experts](https://sambanova.ai/blog/sambalingo-open-source-language-experts)

  ## Getting Started
@@ -185,6 +186,9 @@ The DPO phase was done on the [ultrafeedback](https://huggingface.co/datasets/Hu
  ## Tokenizer Details
  We extended the vocabulary of the base llama model from 32,000 tokens to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language.

+ ## Evaluation
+ For evaluation results see our paper: [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
+
  ## Uses
  <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->

@@ -227,13 +231,13 @@ We would like to give a special thanks to the following groups:

  ## Cite SambaLingo
  ```
- @software{sambalingo,
-   title = {{SambaLingo: Open Source Language Experts}},
-   author = {SambaNova Systems},
-   url = {https://huggingface.co/sambanovasystems/SambaLingo-Thai-Chat}
-   month = {2},
-   year = {2024},
-   version = {1.0},
+ @misc{csaki2024sambalingo,
+   title={SambaLingo: Teaching Large Language Models New Languages},
+   author={Zoltan Csaki and Bo Li and Jonathan Li and Qiantong Xu and Pian Pawakapan and Leon Zhang and Yun Du and Hengyu Zhao and Changran Hu and Urmish Thakker},
+   year={2024},
+   eprint={2404.05829},
+   archivePrefix={arXiv},
+   primaryClass={cs.CL}
  }
  ```
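The Tokenizer Details section in the diff above describes extending the base Llama vocabulary from 32,000 to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language. A minimal sketch of that selection logic (the helper `extend_vocab` and the toy vocabularies are hypothetical, not SambaNova's actual tooling):

```python
# Illustrative sketch, not the actual SambaLingo tokenizer code:
# merge candidate new-language tokens into a base vocabulary, keeping
# only tokens that do not already exist, up to a fixed budget.

def extend_vocab(base_vocab, candidate_tokens, budget):
    """Return base_vocab plus up to `budget` non-overlapping new tokens."""
    vocab = dict(base_vocab)  # token -> id
    next_id = max(vocab.values()) + 1 if vocab else 0
    added = 0
    for tok in candidate_tokens:
        if added >= budget:
            break                      # budget exhausted (e.g. 25,000)
        if tok not in vocab:           # skip tokens the base model already has
            vocab[tok] = next_id
            next_id += 1
            added += 1
    return vocab

# Toy example: 4-token "base" vocab, Thai candidates, budget of 2.
base = {"<s>": 0, "</s>": 1, "hello": 2, "world": 3}
extended = extend_vocab(base, ["สวัสดี", "hello", "ครับ", "ขอบคุณ"], budget=2)
print(len(extended))  # 6: "hello" overlaps with the base vocab, so it is skipped
```

In practice the embedding matrix of the base model must also be resized to the new vocabulary size before continued pretraining; this sketch only shows the non-overlapping selection step.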