huangzixian committed "update readme"
Commit ec7f931 · Parent: 8924ae7

README.md CHANGED
@@ -2,7 +2,7 @@
 
 - **Paper**: "LLaMAX: Scaling Linguistic Horizons of LLM by Enhancing Translation Capabilities Beyond 100 Languages"
 
-- **Link**:
+- **Link**: https://arxiv.org/pdf/2407.05975
 
 - **Repository**: https://github.com/CONE-MT/LLaMAX/
 
@@ -88,5 +88,13 @@ We implement multiple versions of the LLaMAX model, the model links are as follows
 If our model helps your work, please cite this paper:
 
 ```
-
+@misc{lu2024llamaxscalinglinguistichorizons,
+      title={LLaMAX: Scaling Linguistic Horizons of LLM by Enhancing Translation Capabilities Beyond 100 Languages},
+      author={Yinquan Lu and Wenhao Zhu and Lei Li and Yu Qiao and Fei Yuan},
+      year={2024},
+      eprint={2407.05975},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL},
+      url={https://arxiv.org/abs/2407.05975},
+}
 ```