Update README.md
# Universal AnglE Embedding

🔥 Our universal English sentence embedding `WhereIsAI/UAE-Large-V1` achieves **SOTA** on the [MTEB Leaderboard](https://huggingface.co/spaces/mteb/leaderboard) with an average score of 64.64!

Our model is built upon the powerful BGE-Large, enhanced with [AnglE optimizing](https://github.com/SeanLee97/AnglE).
Welcome to follow us on GitHub: https://github.com/SeanLee97/AnglE.

```bash
python -m pip install -U angle-emb
```
1) Non-Retrieval Tasks

```python
from angle_emb import AnglE

angle = AnglE.from_pretrained('WhereIsAI/UAE-Large-V1', pooling_strategy='cls').cuda()
vecs = angle.encode(['hello world1', 'hello world2'], to_numpy=True)
print(vecs)
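The snippet above prints a NumPy array of sentence vectors. Downstream, these are typically compared with cosine similarity; here is a minimal, dependency-free sketch of that comparison (the short toy vectors are stand-ins — real UAE-Large-V1 embeddings are 1024-dimensional):

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Toy stand-ins for two embedding vectors.
v1 = [0.1, 0.3, 0.5]
v2 = [0.2, 0.2, 0.4]

print(cosine_similarity(v1, v2))
```

Identical directions score 1.0 and orthogonal vectors score 0.0, which is why sentence pairs like `'hello world1'` / `'hello world2'` land close to 1.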
2) Retrieval Tasks

For retrieval purposes, please use the prompt `Prompts.C`.
```python
from angle_emb import AnglE, Prompts

angle = AnglE.from_pretrained('WhereIsAI/UAE-Large-V1', pooling_strategy='cls').cuda()
angle.set_prompt(prompt=Prompts.C)
vecs = angle.encode({'text': 'hello world'}, to_numpy=True)
print(vecs)
```
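Conceptually, `Prompts.C` just wraps each query in a retrieval instruction before it is encoded; passages are encoded as-is. A sketch of the idea, assuming the template wording below (verify against `angle_emb.Prompts.C` in your installed version):

```python
# Assumed template text; check angle_emb.Prompts.C for the exact wording.
PROMPT_C = 'Represent this sentence for searching relevant passages: {text}'

def apply_prompt(text: str, template: str = PROMPT_C) -> str:
    # Only queries receive the prompt; candidate passages skip this step.
    return template.format(text=text)

print(apply_prompt('hello world'))
```

This asymmetry (prompted queries, plain passages) is what distinguishes the retrieval setup from the non-retrieval one above.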
# Citation

If you use our pre-trained models, welcome to support us by citing our work: