bwang0911 committed
Commit: a7ee9f2
1 Parent(s): 3ea4a9e

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -2627,7 +2627,7 @@ The model is further trained on Jina AI's collection of more than 400 millions o
  These pairs were obtained from various domains and were carefully selected through a thorough cleaning process.
 
  The embedding model was trained using 512 sequence length, but extrapolates to 8k sequence length (or even longer) thanks to ALiBi.
- This makes our model useful for a range of use cases, especially when processing long documents is needed, including long document retrieval, semantic textual similarity, text reranking, recommendation, RAG and LLM-based generative search,...
+ This makes our model useful for a range of use cases, especially when processing long documents is needed, including long document retrieval, semantic textual similarity, text reranking, recommendation, RAG and LLM-based generative search, etc.
 
  This model has 33 million parameters, which enables lightning-fast and memory efficient inference, while still delivering impressive performance.
  Additionally, we provide the following embedding models:
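
The README text in this diff describes a model trained at a 512-token sequence length that extrapolates to 8k tokens via ALiBi. As a minimal usage sketch of that long-context behavior: the model id `jinaai/jina-embeddings-v2-small-en` (inferred from the 33M-parameter description) and the `encode()`/`max_length` call are assumptions here, following the usual Hugging Face pattern for this model family, since the diff itself shows neither.

```python
# Minimal sketch, assuming the 33M-parameter model described above is
# jinaai/jina-embeddings-v2-small-en and exposes the encode() helper
# typical of this model family.
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "jinaai/jina-embeddings-v2-small-en",  # assumed model id
    trust_remote_code=True,                # custom ALiBi-based architecture
)

# Although trained with 512-token sequences, ALiBi lets the model embed
# much longer inputs; max_length can be raised toward 8192 tokens.
embeddings = model.encode(
    ["A very long document ...", "A short query"],
    max_length=8192,
)
print(embeddings.shape)
```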