huangzixian committed "update readme" in commit 18b5bb2 (1 parent: a8bdb1e)

README.md CHANGED
````diff
@@ -1,11 +1,9 @@
 
 ### Model Sources
 
-Paper
-
-
-
-Repository: https://github.com/CONE-MT/
+- **Paper**: LLaMAX: Scaling Linguistic Horizons of LLM by Enhancing Translation Capabilities Beyond 100 Languages
+- **Link**:
+- **Repository**: https://github.com/CONE-MT/LLaMAX/
 
 ### Model Description
 LLaMAX is a multilingual language model, developed through continued pre-training on Llama3, and supports over 100 languages.
@@ -17,7 +15,7 @@ Its translation capabilities far exceed general models of the same scale, and it
 LLaMAX supports translation between more than 100 languages, surpassing the performance of similarly scaled LLMs.
 
 ```angular2html
-def
+def Prompt_template(query, src_language, trg_language):
     instruction = f'Translate the following sentences from {src_language} to {trg_language}.'
     prompt = (
         'Below is an instruction that describes a task, paired with an input that provides further context. '
@@ -49,11 +47,9 @@ tokenizer.batch_decode(generate_ids, skip_special_tokens=True, clean_up_tokeniza
 LLaMAX preserves its efficacy in general tasks and improves the performance on multilingual tasks.
 We fine-tuned LLaMAX using only the English training set of downstream task, which also shows significant improvements in non-English. We provide fine-tuning LLaMAX models for the following three tasks:
 
-Math Reasoning
-
-
-
-Natural Language Inference: https://huggingface.co/LLaMAX/LLaMAX2-7B-XNLI
+- **Math Reasoning**: https://huggingface.co/LLaMAX/LLaMAX2-7B-MetaMath
+- **Commonsense Reasoning**: https://huggingface.co/LLaMAX/LLaMAX2-7B-X-CSQA
+- **Natural Language Inference**: https://huggingface.co/LLaMAX/LLaMAX2-7B-XNLI
 
 
 ### Supported Languages
````
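The diff shows only the opening lines of `Prompt_template`. For context, here is a minimal runnable sketch of the full function, assuming it completes the standard Alpaca-style instruction template that the visible fragment appears to follow; everything after the `'further context. '` line is an assumption, not taken from this commit.

```python
# Hedged sketch of the translation prompt builder. The diff shows the
# signature, the instruction line, and the first prompt fragment; the
# remainder below assumes the common Alpaca instruction/input/response
# layout and may differ from the actual README.
def Prompt_template(query, src_language, trg_language):
    instruction = f'Translate the following sentences from {src_language} to {trg_language}.'
    prompt = (
        'Below is an instruction that describes a task, paired with an input that provides further context. '
        'Write a response that appropriately completes the request.\n'  # assumed continuation
        f'### Instruction:\n{instruction}\n'
        f'### Input:\n{query}\n### Response:'
    )
    return prompt

# Example: build a French-to-English translation prompt.
print(Prompt_template('Bonjour, comment allez-vous ?', 'French', 'English'))
```

The resulting string would then be tokenized and passed to the model, with the output decoded via `tokenizer.batch_decode(...)` as the later hunk of the README suggests.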