Commit bbc53bb
Parent(s): 9f78368
Add link to new paper (#29)
- Add link to new paper (fdf3dc539da8c22662dd7fef0af417f370db3efd)
Co-authored-by: Omar Sanseviero <osanseviero@users.noreply.huggingface.co>
README.md CHANGED
@@ -5954,6 +5954,9 @@ license: mit
 [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
 Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
 
+[Multilingual E5 Text Embeddings: A Technical Report](https://arxiv.org/abs/2402.05672).
+Liang Wang, Nan Yang, Xiaolong Huang, Linjun Yang, Rangan Majumder, Furu Wei, arXiv 2024
+
 This model has 24 layers and the embedding size is 1024.
 
 ## Usage
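The new references land just above the card's statement that the model has 24 layers and a 1024-dimensional embedding size. As a quick, hedged way to check those numbers, the sketch below loads the model configuration with `transformers`; the repository id `intfloat/multilingual-e5-large` is an assumption (the diff does not name the repo), so substitute the actual model id if it differs.

```python
# Minimal sketch: confirm the layer count and embedding width stated in the card.
# NOTE: the model id below is an assumption, not taken from this commit.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("intfloat/multilingual-e5-large")
print(config.num_hidden_layers)  # expected: 24
print(config.hidden_size)        # expected: 1024 (embedding size)
```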