Tags: Text Generation · Transformers · PyTorch · Hungarian · English · llama · conversational · text-generation-inference · Inference Endpoints
zolicsaki committed
Commit 4ec76bb · 1 Parent(s): 110dbfd

Update README.md

Files changed (1): README.md (+10 −8)
README.md CHANGED

@@ -25,7 +25,6 @@ SambaLingo-Hungarian-Chat-70B is a human aligned chat model trained in Hungarian
 - **Model type:** Language Model
 - **Language(s):** Hungarian, English
 - **Finetuned from model:** [Llama-2-70b](https://huggingface.co/meta-llama/Llama-2-70b-hf)
-- **Try This Model:** [SambaLingo-chat-space](https://huggingface.co/spaces/sambanovasystems/SambaLingo-chat-space)
 - **Paper:** [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
 - **Blog Post**: [sambalingo-open-source-language-experts](https://sambanova.ai/blog/sambalingo-open-source-language-experts)
 
@@ -96,6 +95,9 @@ The DPO phase was done on the [ultrafeedback](https://huggingface.co/datasets/Hu
 ## Tokenizer Details
 We extended the vocabulary of the base llama model from 32,000 tokens to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language.
 
+## Evaluation
+For evaluation results see our paper: [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
+
 ## Uses
 <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
 
@@ -138,12 +140,12 @@ We would like to give a special thanks to the following groups:
 
 ## Cite SambaLingo
 ```
-@software{sambalingo,
-title = {{SambaLingo: Open Source Language Experts}},
-author = {SambaNova Systems},
-url = {https://huggingface.co/sambanovasystems/SambaLingo-Hungarian-Chat-70B}
-month = {2},
-year = {2024},
-version = {1.0},
+@misc{csaki2024sambalingo,
+title={SambaLingo: Teaching Large Language Models New Languages},
+author={Zoltan Csaki and Bo Li and Jonathan Li and Qiantong Xu and Pian Pawakapan and Leon Zhang and Yun Du and Hengyu Zhao and Changran Hu and Urmish Thakker},
+year={2024},
+eprint={2404.05829},
+archivePrefix={arXiv},
+primaryClass={cs.CL}
 }
 ```
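The Tokenizer Details section in the diff describes extending the base Llama vocabulary from 32,000 to 57,000 tokens by adding up to 25,000 non-overlapping new-language tokens. A minimal sketch of that selection logic, with toy stand-in vocabularies (the token lists below are illustrative assumptions, not SambaNova's actual data or pipeline):

```python
def extend_vocab(base_vocab, new_tokens, max_added):
    # Keep only tokens absent from the base vocabulary ("non-overlapping"),
    # capped at max_added, preserving the order they were proposed in.
    added = [t for t in new_tokens if t not in base_vocab][:max_added]
    return base_vocab | set(added), added

# Toy stand-ins: the real base vocabulary has 32,000 tokens, and up to
# 25,000 Hungarian tokens were added for a final size of 57,000.
base = {"the", "and", "of"}
hungarian = ["és", "the", "nem", "hogy"]
vocab, added = extend_vocab(base, hungarian, max_added=25_000)
print(added)       # ['és', 'nem', 'hogy'] — "the" overlaps, so it is skipped
print(len(vocab))  # 6
```

In the Hugging Face Transformers ecosystem the same step typically maps to `tokenizer.add_tokens(...)` followed by `model.resize_token_embeddings(len(tokenizer))`, so the embedding matrix gains rows for the newly added tokens.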