intfloat · osanseviero (HF staff) committed
Commit 97e13e8
1 parent: ffdcc22

Add link to new paper (#7)


- Add link to new paper (f470c6a1a906014160ece1968c484b275f0396de)


Co-authored-by: Omar Sanseviero <osanseviero@users.noreply.huggingface.co>

Files changed (1): README.md (+3 −0)

README.md CHANGED
@@ -5953,6 +5953,9 @@ license: mit
 [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
 Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
 
+[Multilingual E5 Text Embeddings: A Technical Report](https://arxiv.org/abs/2402.05672).
+Liang Wang, Nan Yang, Xiaolong Huang, Linjun Yang, Rangan Majumder, Furu Wei, arXiv 2024
+
 This model has 12 layers and the embedding size is 384.
 
 ## Usage
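
The diff context above notes a 12-layer model with 384-dimensional embeddings. A minimal sketch of checking that dimension, assuming the model in question is `intfloat/multilingual-e5-small` and that `sentence-transformers` is installed (this is not the snippet from the README's own Usage section):

```python
# Hypothetical check; assumes the model id is intfloat/multilingual-e5-small,
# not taken from the README's Usage section.
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("intfloat/multilingual-e5-small")

# E5-style models expect "query: " / "passage: " prefixes on input text.
embeddings = model.encode(
    ["query: how large are the embeddings of this model"],
    normalize_embeddings=True,
)
print(embeddings.shape)  # expected: (1, 384), matching the 384-dim embedding size
```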