---
library_name: peft
license: mit
datasets:
- multi_nli
- snli
language:
- en
metrics:
- spearmanr
---

# AnglE📐: Angle-optimized Text Embeddings

> It is Angle 📐, not Angel 👼.

🔥 A New SOTA Model for Semantic Textual Similarity!

GitHub: https://github.com/SeanLee97/AnglE

<a href="https://arxiv.org/abs/2309.12871">
    <img src="https://img.shields.io/badge/Arxiv-2309.12871-yellow.svg?style=flat-square" alt="https://arxiv.org/abs/2309.12871" />
</a>

Papers with Code leaderboards:
[SICK-R](https://paperswithcode.com/sota/semantic-textual-similarity-on-sick-r-1?p=angle-optimized-text-embeddings) ·
[STS16](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts16?p=angle-optimized-text-embeddings) ·
[STS15](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts15?p=angle-optimized-text-embeddings) ·
[STS14](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts14?p=angle-optimized-text-embeddings) ·
[STS13](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts13?p=angle-optimized-text-embeddings) ·
[STS12](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts12?p=angle-optimized-text-embeddings) ·
[STS Benchmark](https://paperswithcode.com/sota/semantic-textual-similarity-on-sts-benchmark?p=angle-optimized-text-embeddings)

**STS Results**

| Model | STS12 | STS13 | STS14 | STS15 | STS16 | STSBenchmark | SICKRelatedness | Avg. |
| ------- | ----- | ----- | ----- | ----- | ----- | ------------ | --------------- | ----- |
| [SeanLee97/angle-llama-7b-nli-20231027](https://huggingface.co/SeanLee97/angle-llama-7b-nli-20231027) | 78.68 | 90.58 | 85.49 | 89.56 | 86.91 | 88.92 | 81.18 | 85.90 |
| [SeanLee97/angle-llama-7b-nli-v2](https://huggingface.co/SeanLee97/angle-llama-7b-nli-v2) | 79.00 | 90.56 | 85.79 | 89.43 | 87.00 | 88.97 | 80.94 | **85.96** |
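
The STS numbers above are Spearman rank correlations (×100) between the model's predicted similarity scores and human ratings. A minimal sketch of how such a score is computed with `scipy.stats.spearmanr` — the five sentence-pair scores below are made up for illustration, not real model output:

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical cosine similarities predicted by a model for five sentence pairs
predicted = np.array([0.91, 0.35, 0.70, 0.15, 0.52])
# Corresponding human similarity ratings (STS benchmarks use a 0-5 scale)
gold = np.array([4.8, 2.7, 3.9, 0.4, 1.5])

# Spearman correlation compares rankings, so the different scales do not matter
score, _ = spearmanr(predicted, gold)
print(round(score * 100, 2))  # prints 90.0
```

Because only ranks are compared, the metric is insensitive to any monotonic rescaling of the model's scores.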
## Usage

```bash
python -m pip install -U angle-emb
```

```python
from angle_emb import AnglE, Prompts

# init
angle = AnglE.from_pretrained('NousResearch/Llama-2-13b-hf',
                              pretrained_lora_path='SeanLee97/angle-llama-13b-nli',
                              load_kbit=16,
                              apply_bfloat16=False)

# set prompt
print('All predefined prompts:', Prompts.list_prompts())
angle.set_prompt(prompt=Prompts.A)
print('prompt:', angle.prompt)

# encode text
vec = angle.encode({'text': 'hello world'}, to_numpy=True)
print(vec)
vecs = angle.encode([{'text': 'hello world1'}, {'text': 'hello world2'}], to_numpy=True)
print(vecs)
```

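With `to_numpy=True`, `encode` returns plain NumPy arrays, so embeddings can be compared with cosine similarity. A minimal sketch using hand-made stand-in vectors (not real model output; real embeddings are much higher-dimensional):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two 1-D embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Stand-ins for two embeddings returned by angle.encode(...)
vec_a = np.array([0.2, 0.1, 0.9])
vec_b = np.array([0.4, 0.2, 1.8])  # same direction, twice the magnitude

print(cosine_similarity(vec_a, vec_b))  # same direction, so cosine ≈ 1.0
```

Cosine similarity ignores vector magnitude, which is why it is the standard choice for ranking sentence pairs by semantic similarity.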
## Citation

You are welcome to use our code and pre-trained models. If you do, please support us by citing our work:

```bibtex
@article{li2023angle,
  title={AnglE-Optimized Text Embeddings},
  author={Li, Xianming and Li, Jing},
  journal={arXiv preprint arXiv:2309.12871},
  year={2023}
}
```