bwang0911 committed
Commit
c0fa0ce
1 Parent(s): 34234f6

Update README.md

Files changed (1):
  1. README.md +1 -1
README.md CHANGED
@@ -2628,7 +2628,7 @@ The model is further trained on Jina AI's collection of more than 400 millions o
 These pairs were obtained from various domains and were carefully selected through a thorough cleaning process.
 
 The embedding model was trained using 512 sequence length, but extrapolates to 8k sequence length (or even longer) thanks to ALiBi.
-This makes our model useful for a range of use cases, especially when processing long documents is needed, including long document retrieval, semantic textual similarity, text reranking, recommendation, RAG and LLM-based generative search,...
+This makes our model useful for a range of use cases, especially when processing long documents is needed, including long document retrieval, semantic textual similarity, text reranking, recommendation, RAG and LLM-based generative search, etc.
 
 With a standard size of 137 million parameters, the model enables fast inference while delivering better performance than our small model. It is recommended to use a single GPU for inference.
 Additionally, we provide the following embedding models:
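The README text above attributes the 512-to-8k length extrapolation to ALiBi, which replaces learned position embeddings with a linear, distance-based penalty on attention scores. A minimal sketch of how those biases are computed (an illustration of the general technique from the ALiBi paper, not Jina's actual implementation; assumes a power-of-two head count and the symmetric variant suitable for a bidirectional encoder):

```python
def alibi_slopes(n_heads):
    # Per-head slopes form a geometric sequence 2^(-8/n), 2^(-16/n), ...
    # as in the ALiBi paper (assumes n_heads is a power of two).
    start = 2 ** (-8 / n_heads)
    return [start ** (i + 1) for i in range(n_heads)]

def alibi_bias(seq_len, slope):
    # Symmetric (encoder-style) ALiBi bias: each attention score between
    # positions i and j is penalized in proportion to their distance,
    # so no position embedding caps the usable sequence length.
    return [[-slope * abs(i - j) for j in range(seq_len)]
            for i in range(seq_len)]

# Because the bias depends only on relative distance, the same slopes
# apply unchanged at 8k tokens even if training used 512.
slopes = alibi_slopes(8)
bias = alibi_bias(4, slopes[0])
```

The key property is that nothing in the bias is tied to the training length: evaluating at longer sequences just extends the same linear penalty, which is why extrapolation works without retraining.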