Update README.md
README.md CHANGED
@@ -114,7 +114,7 @@ language:
 # nomic-embed-text-v2-moe: Multilingual Mixture of Experts Text Embeddings
 
 ## Model Overview
-`nomic-embed-text-v2-moe` is SoTA multilingual MoE text embedding model:
+`nomic-embed-text-v2-moe` is a SoTA multilingual MoE text embedding model that excels at multilingual retrieval:
 
 - **High Performance**: SoTA multilingual performance compared to ~300M parameter models, competitive with models 2x in size
 - **Multilinguality**: Supports ~100 languages and trained on over 1.6B pairs