intfloat, osanseviero (HF staff) committed
Commit 87d6695
1 Parent(s): 167cf6a

Add link to new paper (#18)


- Add link to new paper (cea84a7cfcd78e04bc1ef6c7182a06fc72a22fbb)


Co-authored-by: Omar Sanseviero <osanseviero@users.noreply.huggingface.co>

Files changed (1): README.md +3 -0
README.md CHANGED
@@ -6781,6 +6781,9 @@ license: mit
 [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
 Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
 
+[Multilingual E5 Text Embeddings: A Technical Report](https://arxiv.org/abs/2402.05672).
+Liang Wang, Nan Yang, Xiaolong Huang, Linjun Yang, Rangan Majumder, Furu Wei, arXiv 2024
+
 This model has 12 layers and the embedding size is 768.
 
 ## Usage
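For context, the E5 models described in the README being edited produce sentence embeddings by average-pooling the encoder's token states over the attention mask and then L2-normalizing. Below is a minimal NumPy sketch of that pooling-and-normalization step only, using the 768-dimensional embedding size stated above; the random `hidden` tensor is a stand-in for real encoder output, and loading an actual checkpoint (e.g. via `transformers.AutoModel`) is out of scope here.

```python
import numpy as np

def average_pool(hidden: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Mean-pool token states, ignoring padding positions (mask == 0)."""
    masked = hidden * mask[..., None]          # zero out padded tokens
    return masked.sum(axis=1) / mask.sum(axis=1)[..., None]

# Dummy batch standing in for encoder output: 2 sequences, 5 tokens, 768 dims.
rng = np.random.default_rng(0)
hidden = rng.standard_normal((2, 5, 768))
mask = np.array([[1, 1, 1, 0, 0],
                 [1, 1, 1, 1, 1]], dtype=float)

emb = average_pool(hidden, mask)
# L2-normalize so that dot products are cosine similarities.
emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)
print(emb.shape)  # (2, 768)
```

With a real checkpoint, the inputs would also carry the E5 `"query: "` / `"passage: "` prefixes described in the model card before tokenization.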