bwang0911 committed
Commit 29a08cb
1 Parent(s): 7ab613f

Update README.md

Files changed (1)
  1. README.md +11 -2
README.md CHANGED
@@ -2633,13 +2633,13 @@ This makes our model useful for a range of use cases, especially when processing
 With a standard size of 137 million parameters, the model enables fast inference while delivering better performance than our small model. It is recommended to use a single GPU for inference.
 Additionally, we provide the following embedding models:
 
-*V1 (Based on T5, 512 Seq)*
+**V1 (Based on T5, 512 Seq)**
 
 - [`jina-embeddings-v1-small-en`](https://huggingface.co/jinaai/jina-embedding-s-en-v1): 35 million parameters.
 - [`jina-embeddings-v1-base-en`](https://huggingface.co/jinaai/jina-embedding-b-en-v1): 110 million parameters.
 - [`jina-embeddings-v2-large-en`](https://huggingface.co/jinaai/jina-embedding-l-en-v1): 330 million parameters.
 
-*V2 (Based on JinaBert, 8k Seq)*
+**V2 (Based on JinaBert, 8k Seq)**
 
 - [`jina-embeddings-v2-small-en`](https://huggingface.co/jinaai/jina-embeddings-v2-small-en): 33 million parameters **(you are here)**.
 - [`jina-embeddings-v2-base-en`](https://huggingface.co/jinaai/jina-embeddings-v2-base-en): 137 million parameters.
@@ -2704,4 +2704,13 @@ Embeddings for Large Documents},
   archivePrefix={arXiv},
   primaryClass={cs.CL}
 }
+
+@misc{günther2023jina,
+      title={Jina Embeddings: A Novel Set of High-Performance Sentence Embedding Models},
+      author={Michael Günther and Louis Milliken and Jonathan Geuter and Georgios Mastrapas and Bo Wang and Han Xiao},
+      year={2023},
+      eprint={2307.11224},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
+}
 ```
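For context on the models listed in the diff above, here is a minimal usage sketch (not part of this commit). It assumes the `transformers` and `torch` packages, that the repo's custom JinaBert modeling code is accepted via `trust_remote_code=True`, and it uses plain mean pooling, which is a common convention for sentence embeddings and may differ from what the full model card prescribes.

```python
# Minimal sketch: embed two sentences with one of the models listed above.
# Assumes `transformers` and `torch` are installed and that the repo's custom
# modeling code may be loaded via trust_remote_code=True.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "jinaai/jina-embeddings-v2-small-en"  # the "(you are here)" model from the list
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id, trust_remote_code=True)
model.eval()

sentences = ["How is the weather today?", "What is the current weather like today?"]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    token_states = model(**batch).last_hidden_state  # (batch, seq_len, hidden)

# Mean-pool over non-padding tokens (a common convention; check the model card).
mask = batch["attention_mask"].unsqueeze(-1).type_as(token_states)
embeddings = (token_states * mask).sum(dim=1) / mask.sum(dim=1)

similarity = torch.nn.functional.cosine_similarity(embeddings[0], embeddings[1], dim=0)
print(f"cosine similarity: {similarity.item():.4f}")
```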