intfloat committed on
Commit
baa7be4
1 Parent(s): c406bc6

Update README.md

Files changed (1): README.md (+6 −9)
README.md CHANGED
@@ -5382,10 +5382,7 @@ license: mit
 
 ## Multilingual-E5-large-instruct
 
-[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
-Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
-
-[Multilingual E5 Text Embeddings: A Technical Report](https://arxiv.org/abs/2402.05672).
+[Multilingual E5 Text Embeddings: A Technical Report](https://arxiv.org/pdf/2402.05672).
 Liang Wang, Nan Yang, Xiaolong Huang, Linjun Yang, Rangan Majumder, Furu Wei, arXiv 2024
 
 This model has 24 layers and the embedding size is 1024.
@@ -5518,11 +5515,11 @@ so this should not be an issue.
 If you find our paper or models helpful, please consider cite as follows:
 
 ```
-@article{wang2022text,
- title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
- author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
- journal={arXiv preprint arXiv:2212.03533},
- year={2022}
+@article{wang2024multilingual,
+ title={Multilingual E5 Text Embeddings: A Technical Report},
+ author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Yang, Linjun and Majumder, Rangan and Wei, Furu},
+ journal={arXiv preprint arXiv:2402.05672},
+ year={2024}
 }
 ```
 