anttip and michaelfeil committed
Commit 476dfe9
0 Parent(s)

Duplicate from michaelfeil/ct2fast-e5-small-v2

Co-authored-by: Michael <michaelfeil@users.noreply.huggingface.co>

.gitattributes ADDED
@@ -0,0 +1,34 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
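
The attribute patterns above route large binaries (weights, archives, serialized tensors) through Git LFS, so the duplicated repository's model files are stored as LFS pointers rather than plain blobs. As a minimal, illustrative sketch (not part of this commit; it assumes the `huggingface_hub` package is installed), the repository named in the commit message could be fetched like this, with the LFS-tracked files resolved transparently:

```python
# Illustrative only: pull the repository referenced in the commit message.
# Assumes `pip install huggingface_hub`; snapshot_download downloads the repo,
# including the LFS-tracked files (e.g. *.bin weights) declared in .gitattributes.
from huggingface_hub import snapshot_download

local_dir = snapshot_download(repo_id="michaelfeil/ct2fast-e5-small-v2")
print(local_dir)  # local folder containing the downloaded model files
```
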
README.md ADDED
@@ -0,0 +1,2750 @@
1
+ ---
2
+ tags:
3
+ - ctranslate2
4
+ - int8
5
+ - float16
6
+ - mteb
7
+ model-index:
8
+ - name: e5-small-v2
9
+ results:
10
+ - task:
11
+ type: Classification
12
+ dataset:
13
+ type: mteb/amazon_counterfactual
14
+ name: MTEB AmazonCounterfactualClassification (en)
15
+ config: en
16
+ split: test
17
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
18
+ metrics:
19
+ - type: accuracy
20
+ value: 77.59701492537313
21
+ - type: ap
22
+ value: 41.67064885731708
23
+ - type: f1
24
+ value: 71.86465946398573
25
+ - task:
26
+ type: Classification
27
+ dataset:
28
+ type: mteb/amazon_polarity
29
+ name: MTEB AmazonPolarityClassification
30
+ config: default
31
+ split: test
32
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
33
+ metrics:
34
+ - type: accuracy
35
+ value: 91.265875
36
+ - type: ap
37
+ value: 87.67633085349644
38
+ - type: f1
39
+ value: 91.24297521425744
40
+ - task:
41
+ type: Classification
42
+ dataset:
43
+ type: mteb/amazon_reviews_multi
44
+ name: MTEB AmazonReviewsClassification (en)
45
+ config: en
46
+ split: test
47
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
48
+ metrics:
49
+ - type: accuracy
50
+ value: 45.882000000000005
51
+ - type: f1
52
+ value: 45.08058870381236
53
+ - task:
54
+ type: Retrieval
55
+ dataset:
56
+ type: arguana
57
+ name: MTEB ArguAna
58
+ config: default
59
+ split: test
60
+ revision: None
61
+ metrics:
62
+ - type: map_at_1
63
+ value: 20.697
64
+ - type: map_at_10
65
+ value: 33.975
66
+ - type: map_at_100
67
+ value: 35.223
68
+ - type: map_at_1000
69
+ value: 35.260000000000005
70
+ - type: map_at_3
71
+ value: 29.776999999999997
72
+ - type: map_at_5
73
+ value: 32.035000000000004
74
+ - type: mrr_at_1
75
+ value: 20.982
76
+ - type: mrr_at_10
77
+ value: 34.094
78
+ - type: mrr_at_100
79
+ value: 35.343
80
+ - type: mrr_at_1000
81
+ value: 35.38
82
+ - type: mrr_at_3
83
+ value: 29.884
84
+ - type: mrr_at_5
85
+ value: 32.141999999999996
86
+ - type: ndcg_at_1
87
+ value: 20.697
88
+ - type: ndcg_at_10
89
+ value: 41.668
90
+ - type: ndcg_at_100
91
+ value: 47.397
92
+ - type: ndcg_at_1000
93
+ value: 48.305
94
+ - type: ndcg_at_3
95
+ value: 32.928000000000004
96
+ - type: ndcg_at_5
97
+ value: 36.998999999999995
98
+ - type: precision_at_1
99
+ value: 20.697
100
+ - type: precision_at_10
101
+ value: 6.636
102
+ - type: precision_at_100
103
+ value: 0.924
104
+ - type: precision_at_1000
105
+ value: 0.099
106
+ - type: precision_at_3
107
+ value: 14.035
108
+ - type: precision_at_5
109
+ value: 10.398
110
+ - type: recall_at_1
111
+ value: 20.697
112
+ - type: recall_at_10
113
+ value: 66.35799999999999
114
+ - type: recall_at_100
115
+ value: 92.39
116
+ - type: recall_at_1000
117
+ value: 99.36
118
+ - type: recall_at_3
119
+ value: 42.105
120
+ - type: recall_at_5
121
+ value: 51.991
122
+ - task:
123
+ type: Clustering
124
+ dataset:
125
+ type: mteb/arxiv-clustering-p2p
126
+ name: MTEB ArxivClusteringP2P
127
+ config: default
128
+ split: test
129
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
130
+ metrics:
131
+ - type: v_measure
132
+ value: 42.1169517447068
133
+ - task:
134
+ type: Clustering
135
+ dataset:
136
+ type: mteb/arxiv-clustering-s2s
137
+ name: MTEB ArxivClusteringS2S
138
+ config: default
139
+ split: test
140
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
141
+ metrics:
142
+ - type: v_measure
143
+ value: 34.79553720107097
144
+ - task:
145
+ type: Reranking
146
+ dataset:
147
+ type: mteb/askubuntudupquestions-reranking
148
+ name: MTEB AskUbuntuDupQuestions
149
+ config: default
150
+ split: test
151
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
152
+ metrics:
153
+ - type: map
154
+ value: 58.10811337308168
155
+ - type: mrr
156
+ value: 71.56410763751482
157
+ - task:
158
+ type: STS
159
+ dataset:
160
+ type: mteb/biosses-sts
161
+ name: MTEB BIOSSES
162
+ config: default
163
+ split: test
164
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
165
+ metrics:
166
+ - type: cos_sim_pearson
167
+ value: 78.46834918248696
168
+ - type: cos_sim_spearman
169
+ value: 79.4289182755206
170
+ - type: euclidean_pearson
171
+ value: 76.26662973727008
172
+ - type: euclidean_spearman
173
+ value: 78.11744260952536
174
+ - type: manhattan_pearson
175
+ value: 76.08175262609434
176
+ - type: manhattan_spearman
177
+ value: 78.29395265552289
178
+ - task:
179
+ type: Classification
180
+ dataset:
181
+ type: mteb/banking77
182
+ name: MTEB Banking77Classification
183
+ config: default
184
+ split: test
185
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
186
+ metrics:
187
+ - type: accuracy
188
+ value: 81.63636363636364
189
+ - type: f1
190
+ value: 81.55779952376953
191
+ - task:
192
+ type: Clustering
193
+ dataset:
194
+ type: mteb/biorxiv-clustering-p2p
195
+ name: MTEB BiorxivClusteringP2P
196
+ config: default
197
+ split: test
198
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
199
+ metrics:
200
+ - type: v_measure
201
+ value: 35.88541137137571
202
+ - task:
203
+ type: Clustering
204
+ dataset:
205
+ type: mteb/biorxiv-clustering-s2s
206
+ name: MTEB BiorxivClusteringS2S
207
+ config: default
208
+ split: test
209
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
210
+ metrics:
211
+ - type: v_measure
212
+ value: 30.05205685274407
213
+ - task:
214
+ type: Retrieval
215
+ dataset:
216
+ type: BeIR/cqadupstack
217
+ name: MTEB CQADupstackAndroidRetrieval
218
+ config: default
219
+ split: test
220
+ revision: None
221
+ metrics:
222
+ - type: map_at_1
223
+ value: 30.293999999999997
224
+ - type: map_at_10
225
+ value: 39.876
226
+ - type: map_at_100
227
+ value: 41.315000000000005
228
+ - type: map_at_1000
229
+ value: 41.451
230
+ - type: map_at_3
231
+ value: 37.194
232
+ - type: map_at_5
233
+ value: 38.728
234
+ - type: mrr_at_1
235
+ value: 37.053000000000004
236
+ - type: mrr_at_10
237
+ value: 45.281
238
+ - type: mrr_at_100
239
+ value: 46.188
240
+ - type: mrr_at_1000
241
+ value: 46.245999999999995
242
+ - type: mrr_at_3
243
+ value: 43.228
244
+ - type: mrr_at_5
245
+ value: 44.366
246
+ - type: ndcg_at_1
247
+ value: 37.053000000000004
248
+ - type: ndcg_at_10
249
+ value: 45.086
250
+ - type: ndcg_at_100
251
+ value: 50.756
252
+ - type: ndcg_at_1000
253
+ value: 53.123
254
+ - type: ndcg_at_3
255
+ value: 41.416
256
+ - type: ndcg_at_5
257
+ value: 43.098
258
+ - type: precision_at_1
259
+ value: 37.053000000000004
260
+ - type: precision_at_10
261
+ value: 8.34
262
+ - type: precision_at_100
263
+ value: 1.346
264
+ - type: precision_at_1000
265
+ value: 0.186
266
+ - type: precision_at_3
267
+ value: 19.647000000000002
268
+ - type: precision_at_5
269
+ value: 13.877
270
+ - type: recall_at_1
271
+ value: 30.293999999999997
272
+ - type: recall_at_10
273
+ value: 54.309
274
+ - type: recall_at_100
275
+ value: 78.59
276
+ - type: recall_at_1000
277
+ value: 93.82300000000001
278
+ - type: recall_at_3
279
+ value: 43.168
280
+ - type: recall_at_5
281
+ value: 48.192
282
+ - task:
283
+ type: Retrieval
284
+ dataset:
285
+ type: BeIR/cqadupstack
286
+ name: MTEB CQADupstackEnglishRetrieval
287
+ config: default
288
+ split: test
289
+ revision: None
290
+ metrics:
291
+ - type: map_at_1
292
+ value: 28.738000000000003
293
+ - type: map_at_10
294
+ value: 36.925999999999995
295
+ - type: map_at_100
296
+ value: 38.017
297
+ - type: map_at_1000
298
+ value: 38.144
299
+ - type: map_at_3
300
+ value: 34.446
301
+ - type: map_at_5
302
+ value: 35.704
303
+ - type: mrr_at_1
304
+ value: 35.478
305
+ - type: mrr_at_10
306
+ value: 42.786
307
+ - type: mrr_at_100
308
+ value: 43.458999999999996
309
+ - type: mrr_at_1000
310
+ value: 43.507
311
+ - type: mrr_at_3
312
+ value: 40.648
313
+ - type: mrr_at_5
314
+ value: 41.804
315
+ - type: ndcg_at_1
316
+ value: 35.478
317
+ - type: ndcg_at_10
318
+ value: 42.044
319
+ - type: ndcg_at_100
320
+ value: 46.249
321
+ - type: ndcg_at_1000
322
+ value: 48.44
323
+ - type: ndcg_at_3
324
+ value: 38.314
325
+ - type: ndcg_at_5
326
+ value: 39.798
327
+ - type: precision_at_1
328
+ value: 35.478
329
+ - type: precision_at_10
330
+ value: 7.764
331
+ - type: precision_at_100
332
+ value: 1.253
333
+ - type: precision_at_1000
334
+ value: 0.174
335
+ - type: precision_at_3
336
+ value: 18.047
337
+ - type: precision_at_5
338
+ value: 12.637
339
+ - type: recall_at_1
340
+ value: 28.738000000000003
341
+ - type: recall_at_10
342
+ value: 50.659
343
+ - type: recall_at_100
344
+ value: 68.76299999999999
345
+ - type: recall_at_1000
346
+ value: 82.811
347
+ - type: recall_at_3
348
+ value: 39.536
349
+ - type: recall_at_5
350
+ value: 43.763999999999996
351
+ - task:
352
+ type: Retrieval
353
+ dataset:
354
+ type: BeIR/cqadupstack
355
+ name: MTEB CQADupstackGamingRetrieval
356
+ config: default
357
+ split: test
358
+ revision: None
359
+ metrics:
360
+ - type: map_at_1
361
+ value: 38.565
362
+ - type: map_at_10
363
+ value: 50.168
364
+ - type: map_at_100
365
+ value: 51.11
366
+ - type: map_at_1000
367
+ value: 51.173
368
+ - type: map_at_3
369
+ value: 47.044000000000004
370
+ - type: map_at_5
371
+ value: 48.838
372
+ - type: mrr_at_1
373
+ value: 44.201
374
+ - type: mrr_at_10
375
+ value: 53.596999999999994
376
+ - type: mrr_at_100
377
+ value: 54.211
378
+ - type: mrr_at_1000
379
+ value: 54.247
380
+ - type: mrr_at_3
381
+ value: 51.202000000000005
382
+ - type: mrr_at_5
383
+ value: 52.608999999999995
384
+ - type: ndcg_at_1
385
+ value: 44.201
386
+ - type: ndcg_at_10
387
+ value: 55.694
388
+ - type: ndcg_at_100
389
+ value: 59.518
390
+ - type: ndcg_at_1000
391
+ value: 60.907
392
+ - type: ndcg_at_3
393
+ value: 50.395999999999994
394
+ - type: ndcg_at_5
395
+ value: 53.022999999999996
396
+ - type: precision_at_1
397
+ value: 44.201
398
+ - type: precision_at_10
399
+ value: 8.84
400
+ - type: precision_at_100
401
+ value: 1.162
402
+ - type: precision_at_1000
403
+ value: 0.133
404
+ - type: precision_at_3
405
+ value: 22.153
406
+ - type: precision_at_5
407
+ value: 15.260000000000002
408
+ - type: recall_at_1
409
+ value: 38.565
410
+ - type: recall_at_10
411
+ value: 68.65
412
+ - type: recall_at_100
413
+ value: 85.37400000000001
414
+ - type: recall_at_1000
415
+ value: 95.37400000000001
416
+ - type: recall_at_3
417
+ value: 54.645999999999994
418
+ - type: recall_at_5
419
+ value: 60.958
420
+ - task:
421
+ type: Retrieval
422
+ dataset:
423
+ type: BeIR/cqadupstack
424
+ name: MTEB CQADupstackGisRetrieval
425
+ config: default
426
+ split: test
427
+ revision: None
428
+ metrics:
429
+ - type: map_at_1
430
+ value: 23.945
431
+ - type: map_at_10
432
+ value: 30.641000000000002
433
+ - type: map_at_100
434
+ value: 31.599
435
+ - type: map_at_1000
436
+ value: 31.691000000000003
437
+ - type: map_at_3
438
+ value: 28.405
439
+ - type: map_at_5
440
+ value: 29.704000000000004
441
+ - type: mrr_at_1
442
+ value: 25.537
443
+ - type: mrr_at_10
444
+ value: 32.22
445
+ - type: mrr_at_100
446
+ value: 33.138
447
+ - type: mrr_at_1000
448
+ value: 33.214
449
+ - type: mrr_at_3
450
+ value: 30.151
451
+ - type: mrr_at_5
452
+ value: 31.298
453
+ - type: ndcg_at_1
454
+ value: 25.537
455
+ - type: ndcg_at_10
456
+ value: 34.638000000000005
457
+ - type: ndcg_at_100
458
+ value: 39.486
459
+ - type: ndcg_at_1000
460
+ value: 41.936
461
+ - type: ndcg_at_3
462
+ value: 30.333
463
+ - type: ndcg_at_5
464
+ value: 32.482
465
+ - type: precision_at_1
466
+ value: 25.537
467
+ - type: precision_at_10
468
+ value: 5.153
469
+ - type: precision_at_100
470
+ value: 0.7929999999999999
471
+ - type: precision_at_1000
472
+ value: 0.104
473
+ - type: precision_at_3
474
+ value: 12.429
475
+ - type: precision_at_5
476
+ value: 8.723
477
+ - type: recall_at_1
478
+ value: 23.945
479
+ - type: recall_at_10
480
+ value: 45.412
481
+ - type: recall_at_100
482
+ value: 67.836
483
+ - type: recall_at_1000
484
+ value: 86.467
485
+ - type: recall_at_3
486
+ value: 34.031
487
+ - type: recall_at_5
488
+ value: 39.039
489
+ - task:
490
+ type: Retrieval
491
+ dataset:
492
+ type: BeIR/cqadupstack
493
+ name: MTEB CQADupstackMathematicaRetrieval
494
+ config: default
495
+ split: test
496
+ revision: None
497
+ metrics:
498
+ - type: map_at_1
499
+ value: 14.419
500
+ - type: map_at_10
501
+ value: 20.858999999999998
502
+ - type: map_at_100
503
+ value: 22.067999999999998
504
+ - type: map_at_1000
505
+ value: 22.192
506
+ - type: map_at_3
507
+ value: 18.673000000000002
508
+ - type: map_at_5
509
+ value: 19.968
510
+ - type: mrr_at_1
511
+ value: 17.785999999999998
512
+ - type: mrr_at_10
513
+ value: 24.878
514
+ - type: mrr_at_100
515
+ value: 26.021
516
+ - type: mrr_at_1000
517
+ value: 26.095000000000002
518
+ - type: mrr_at_3
519
+ value: 22.616
520
+ - type: mrr_at_5
521
+ value: 23.785
522
+ - type: ndcg_at_1
523
+ value: 17.785999999999998
524
+ - type: ndcg_at_10
525
+ value: 25.153
526
+ - type: ndcg_at_100
527
+ value: 31.05
528
+ - type: ndcg_at_1000
529
+ value: 34.052
530
+ - type: ndcg_at_3
531
+ value: 21.117
532
+ - type: ndcg_at_5
533
+ value: 23.048
534
+ - type: precision_at_1
535
+ value: 17.785999999999998
536
+ - type: precision_at_10
537
+ value: 4.590000000000001
538
+ - type: precision_at_100
539
+ value: 0.864
540
+ - type: precision_at_1000
541
+ value: 0.125
542
+ - type: precision_at_3
543
+ value: 9.908999999999999
544
+ - type: precision_at_5
545
+ value: 7.313
546
+ - type: recall_at_1
547
+ value: 14.419
548
+ - type: recall_at_10
549
+ value: 34.477999999999994
550
+ - type: recall_at_100
551
+ value: 60.02499999999999
552
+ - type: recall_at_1000
553
+ value: 81.646
554
+ - type: recall_at_3
555
+ value: 23.515
556
+ - type: recall_at_5
557
+ value: 28.266999999999996
558
+ - task:
559
+ type: Retrieval
560
+ dataset:
561
+ type: BeIR/cqadupstack
562
+ name: MTEB CQADupstackPhysicsRetrieval
563
+ config: default
564
+ split: test
565
+ revision: None
566
+ metrics:
567
+ - type: map_at_1
568
+ value: 26.268
569
+ - type: map_at_10
570
+ value: 35.114000000000004
571
+ - type: map_at_100
572
+ value: 36.212
573
+ - type: map_at_1000
574
+ value: 36.333
575
+ - type: map_at_3
576
+ value: 32.436
577
+ - type: map_at_5
578
+ value: 33.992
579
+ - type: mrr_at_1
580
+ value: 31.761
581
+ - type: mrr_at_10
582
+ value: 40.355999999999995
583
+ - type: mrr_at_100
584
+ value: 41.125
585
+ - type: mrr_at_1000
586
+ value: 41.186
587
+ - type: mrr_at_3
588
+ value: 37.937
589
+ - type: mrr_at_5
590
+ value: 39.463
591
+ - type: ndcg_at_1
592
+ value: 31.761
593
+ - type: ndcg_at_10
594
+ value: 40.422000000000004
595
+ - type: ndcg_at_100
596
+ value: 45.458999999999996
597
+ - type: ndcg_at_1000
598
+ value: 47.951
599
+ - type: ndcg_at_3
600
+ value: 35.972
601
+ - type: ndcg_at_5
602
+ value: 38.272
603
+ - type: precision_at_1
604
+ value: 31.761
605
+ - type: precision_at_10
606
+ value: 7.103
607
+ - type: precision_at_100
608
+ value: 1.133
609
+ - type: precision_at_1000
610
+ value: 0.152
611
+ - type: precision_at_3
612
+ value: 16.779
613
+ - type: precision_at_5
614
+ value: 11.877
615
+ - type: recall_at_1
616
+ value: 26.268
617
+ - type: recall_at_10
618
+ value: 51.053000000000004
619
+ - type: recall_at_100
620
+ value: 72.702
621
+ - type: recall_at_1000
622
+ value: 89.521
623
+ - type: recall_at_3
624
+ value: 38.619
625
+ - type: recall_at_5
626
+ value: 44.671
627
+ - task:
628
+ type: Retrieval
629
+ dataset:
630
+ type: BeIR/cqadupstack
631
+ name: MTEB CQADupstackProgrammersRetrieval
632
+ config: default
633
+ split: test
634
+ revision: None
635
+ metrics:
636
+ - type: map_at_1
637
+ value: 25.230999999999998
638
+ - type: map_at_10
639
+ value: 34.227000000000004
640
+ - type: map_at_100
641
+ value: 35.370000000000005
642
+ - type: map_at_1000
643
+ value: 35.488
644
+ - type: map_at_3
645
+ value: 31.496000000000002
646
+ - type: map_at_5
647
+ value: 33.034
648
+ - type: mrr_at_1
649
+ value: 30.822
650
+ - type: mrr_at_10
651
+ value: 39.045
652
+ - type: mrr_at_100
653
+ value: 39.809
654
+ - type: mrr_at_1000
655
+ value: 39.873
656
+ - type: mrr_at_3
657
+ value: 36.663000000000004
658
+ - type: mrr_at_5
659
+ value: 37.964
660
+ - type: ndcg_at_1
661
+ value: 30.822
662
+ - type: ndcg_at_10
663
+ value: 39.472
664
+ - type: ndcg_at_100
665
+ value: 44.574999999999996
666
+ - type: ndcg_at_1000
667
+ value: 47.162
668
+ - type: ndcg_at_3
669
+ value: 34.929
670
+ - type: ndcg_at_5
671
+ value: 37.002
672
+ - type: precision_at_1
673
+ value: 30.822
674
+ - type: precision_at_10
675
+ value: 7.055
676
+ - type: precision_at_100
677
+ value: 1.124
678
+ - type: precision_at_1000
679
+ value: 0.152
680
+ - type: precision_at_3
681
+ value: 16.591
682
+ - type: precision_at_5
683
+ value: 11.667
684
+ - type: recall_at_1
685
+ value: 25.230999999999998
686
+ - type: recall_at_10
687
+ value: 50.42100000000001
688
+ - type: recall_at_100
689
+ value: 72.685
690
+ - type: recall_at_1000
691
+ value: 90.469
692
+ - type: recall_at_3
693
+ value: 37.503
694
+ - type: recall_at_5
695
+ value: 43.123
696
+ - task:
697
+ type: Retrieval
698
+ dataset:
699
+ type: BeIR/cqadupstack
700
+ name: MTEB CQADupstackRetrieval
701
+ config: default
702
+ split: test
703
+ revision: None
704
+ metrics:
705
+ - type: map_at_1
706
+ value: 24.604166666666664
707
+ - type: map_at_10
708
+ value: 32.427166666666665
709
+ - type: map_at_100
710
+ value: 33.51474999999999
711
+ - type: map_at_1000
712
+ value: 33.6345
713
+ - type: map_at_3
714
+ value: 30.02366666666667
715
+ - type: map_at_5
716
+ value: 31.382333333333328
717
+ - type: mrr_at_1
718
+ value: 29.001166666666666
719
+ - type: mrr_at_10
720
+ value: 36.3315
721
+ - type: mrr_at_100
722
+ value: 37.16683333333333
723
+ - type: mrr_at_1000
724
+ value: 37.23341666666668
725
+ - type: mrr_at_3
726
+ value: 34.19916666666667
727
+ - type: mrr_at_5
728
+ value: 35.40458333333334
729
+ - type: ndcg_at_1
730
+ value: 29.001166666666666
731
+ - type: ndcg_at_10
732
+ value: 37.06883333333334
733
+ - type: ndcg_at_100
734
+ value: 41.95816666666666
735
+ - type: ndcg_at_1000
736
+ value: 44.501583333333336
737
+ - type: ndcg_at_3
738
+ value: 32.973499999999994
739
+ - type: ndcg_at_5
740
+ value: 34.90833333333334
741
+ - type: precision_at_1
742
+ value: 29.001166666666666
743
+ - type: precision_at_10
744
+ value: 6.336
745
+ - type: precision_at_100
746
+ value: 1.0282499999999999
747
+ - type: precision_at_1000
748
+ value: 0.14391666666666664
749
+ - type: precision_at_3
750
+ value: 14.932499999999996
751
+ - type: precision_at_5
752
+ value: 10.50825
753
+ - type: recall_at_1
754
+ value: 24.604166666666664
755
+ - type: recall_at_10
756
+ value: 46.9525
757
+ - type: recall_at_100
758
+ value: 68.67816666666667
759
+ - type: recall_at_1000
760
+ value: 86.59783333333334
761
+ - type: recall_at_3
762
+ value: 35.49783333333333
763
+ - type: recall_at_5
764
+ value: 40.52525000000001
765
+ - task:
766
+ type: Retrieval
767
+ dataset:
768
+ type: BeIR/cqadupstack
769
+ name: MTEB CQADupstackStatsRetrieval
770
+ config: default
771
+ split: test
772
+ revision: None
773
+ metrics:
774
+ - type: map_at_1
775
+ value: 23.559
776
+ - type: map_at_10
777
+ value: 29.023
778
+ - type: map_at_100
779
+ value: 29.818
780
+ - type: map_at_1000
781
+ value: 29.909000000000002
782
+ - type: map_at_3
783
+ value: 27.037
784
+ - type: map_at_5
785
+ value: 28.225
786
+ - type: mrr_at_1
787
+ value: 26.994
788
+ - type: mrr_at_10
789
+ value: 31.962000000000003
790
+ - type: mrr_at_100
791
+ value: 32.726
792
+ - type: mrr_at_1000
793
+ value: 32.800000000000004
794
+ - type: mrr_at_3
795
+ value: 30.266
796
+ - type: mrr_at_5
797
+ value: 31.208999999999996
798
+ - type: ndcg_at_1
799
+ value: 26.994
800
+ - type: ndcg_at_10
801
+ value: 32.53
802
+ - type: ndcg_at_100
803
+ value: 36.758
804
+ - type: ndcg_at_1000
805
+ value: 39.362
806
+ - type: ndcg_at_3
807
+ value: 28.985
808
+ - type: ndcg_at_5
809
+ value: 30.757
810
+ - type: precision_at_1
811
+ value: 26.994
812
+ - type: precision_at_10
813
+ value: 4.968999999999999
814
+ - type: precision_at_100
815
+ value: 0.759
816
+ - type: precision_at_1000
817
+ value: 0.106
818
+ - type: precision_at_3
819
+ value: 12.219
820
+ - type: precision_at_5
821
+ value: 8.527999999999999
822
+ - type: recall_at_1
823
+ value: 23.559
824
+ - type: recall_at_10
825
+ value: 40.585
826
+ - type: recall_at_100
827
+ value: 60.306000000000004
828
+ - type: recall_at_1000
829
+ value: 80.11
830
+ - type: recall_at_3
831
+ value: 30.794
832
+ - type: recall_at_5
833
+ value: 35.186
834
+ - task:
835
+ type: Retrieval
836
+ dataset:
837
+ type: BeIR/cqadupstack
838
+ name: MTEB CQADupstackTexRetrieval
839
+ config: default
840
+ split: test
841
+ revision: None
842
+ metrics:
843
+ - type: map_at_1
844
+ value: 16.384999999999998
845
+ - type: map_at_10
846
+ value: 22.142
847
+ - type: map_at_100
848
+ value: 23.057
849
+ - type: map_at_1000
850
+ value: 23.177
851
+ - type: map_at_3
852
+ value: 20.29
853
+ - type: map_at_5
854
+ value: 21.332
855
+ - type: mrr_at_1
856
+ value: 19.89
857
+ - type: mrr_at_10
858
+ value: 25.771
859
+ - type: mrr_at_100
860
+ value: 26.599
861
+ - type: mrr_at_1000
862
+ value: 26.680999999999997
863
+ - type: mrr_at_3
864
+ value: 23.962
865
+ - type: mrr_at_5
866
+ value: 24.934
867
+ - type: ndcg_at_1
868
+ value: 19.89
869
+ - type: ndcg_at_10
870
+ value: 25.97
871
+ - type: ndcg_at_100
872
+ value: 30.605
873
+ - type: ndcg_at_1000
874
+ value: 33.619
875
+ - type: ndcg_at_3
876
+ value: 22.704
877
+ - type: ndcg_at_5
878
+ value: 24.199
879
+ - type: precision_at_1
880
+ value: 19.89
881
+ - type: precision_at_10
882
+ value: 4.553
883
+ - type: precision_at_100
884
+ value: 0.8049999999999999
885
+ - type: precision_at_1000
886
+ value: 0.122
887
+ - type: precision_at_3
888
+ value: 10.541
889
+ - type: precision_at_5
890
+ value: 7.46
891
+ - type: recall_at_1
892
+ value: 16.384999999999998
893
+ - type: recall_at_10
894
+ value: 34.001
895
+ - type: recall_at_100
896
+ value: 55.17100000000001
897
+ - type: recall_at_1000
898
+ value: 77.125
899
+ - type: recall_at_3
900
+ value: 24.618000000000002
901
+ - type: recall_at_5
902
+ value: 28.695999999999998
903
+ - task:
904
+ type: Retrieval
905
+ dataset:
906
+ type: BeIR/cqadupstack
907
+ name: MTEB CQADupstackUnixRetrieval
908
+ config: default
909
+ split: test
910
+ revision: None
911
+ metrics:
912
+ - type: map_at_1
913
+ value: 23.726
914
+ - type: map_at_10
915
+ value: 31.227
916
+ - type: map_at_100
917
+ value: 32.311
918
+ - type: map_at_1000
919
+ value: 32.419
920
+ - type: map_at_3
921
+ value: 28.765
922
+ - type: map_at_5
923
+ value: 30.229
924
+ - type: mrr_at_1
925
+ value: 27.705000000000002
926
+ - type: mrr_at_10
927
+ value: 35.085
928
+ - type: mrr_at_100
929
+ value: 35.931000000000004
930
+ - type: mrr_at_1000
931
+ value: 36
932
+ - type: mrr_at_3
933
+ value: 32.603
934
+ - type: mrr_at_5
935
+ value: 34.117999999999995
936
+ - type: ndcg_at_1
937
+ value: 27.705000000000002
938
+ - type: ndcg_at_10
939
+ value: 35.968
940
+ - type: ndcg_at_100
941
+ value: 41.197
942
+ - type: ndcg_at_1000
943
+ value: 43.76
944
+ - type: ndcg_at_3
945
+ value: 31.304
946
+ - type: ndcg_at_5
947
+ value: 33.661
948
+ - type: precision_at_1
949
+ value: 27.705000000000002
950
+ - type: precision_at_10
951
+ value: 5.942
952
+ - type: precision_at_100
953
+ value: 0.964
954
+ - type: precision_at_1000
955
+ value: 0.13
956
+ - type: precision_at_3
957
+ value: 13.868
958
+ - type: precision_at_5
959
+ value: 9.944
960
+ - type: recall_at_1
961
+ value: 23.726
962
+ - type: recall_at_10
963
+ value: 46.786
964
+ - type: recall_at_100
965
+ value: 70.072
966
+ - type: recall_at_1000
967
+ value: 88.2
968
+ - type: recall_at_3
969
+ value: 33.981
970
+ - type: recall_at_5
971
+ value: 39.893
972
+ - task:
973
+ type: Retrieval
974
+ dataset:
975
+ type: BeIR/cqadupstack
976
+ name: MTEB CQADupstackWebmastersRetrieval
977
+ config: default
978
+ split: test
979
+ revision: None
980
+ metrics:
981
+ - type: map_at_1
982
+ value: 23.344
983
+ - type: map_at_10
984
+ value: 31.636999999999997
985
+ - type: map_at_100
986
+ value: 33.065
987
+ - type: map_at_1000
988
+ value: 33.300000000000004
989
+ - type: map_at_3
990
+ value: 29.351
991
+ - type: map_at_5
992
+ value: 30.432
993
+ - type: mrr_at_1
994
+ value: 27.866000000000003
995
+ - type: mrr_at_10
996
+ value: 35.587
997
+ - type: mrr_at_100
998
+ value: 36.52
999
+ - type: mrr_at_1000
1000
+ value: 36.597
1001
+ - type: mrr_at_3
1002
+ value: 33.696
1003
+ - type: mrr_at_5
1004
+ value: 34.713
1005
+ - type: ndcg_at_1
1006
+ value: 27.866000000000003
1007
+ - type: ndcg_at_10
1008
+ value: 36.61
1009
+ - type: ndcg_at_100
1010
+ value: 41.88
1011
+ - type: ndcg_at_1000
1012
+ value: 45.105000000000004
1013
+ - type: ndcg_at_3
1014
+ value: 33.038000000000004
1015
+ - type: ndcg_at_5
1016
+ value: 34.331
1017
+ - type: precision_at_1
1018
+ value: 27.866000000000003
1019
+ - type: precision_at_10
1020
+ value: 6.917
1021
+ - type: precision_at_100
1022
+ value: 1.3599999999999999
1023
+ - type: precision_at_1000
1024
+ value: 0.233
1025
+ - type: precision_at_3
1026
+ value: 15.547
1027
+ - type: precision_at_5
1028
+ value: 10.791
1029
+ - type: recall_at_1
1030
+ value: 23.344
1031
+ - type: recall_at_10
1032
+ value: 45.782000000000004
1033
+ - type: recall_at_100
1034
+ value: 69.503
1035
+ - type: recall_at_1000
1036
+ value: 90.742
1037
+ - type: recall_at_3
1038
+ value: 35.160000000000004
1039
+ - type: recall_at_5
1040
+ value: 39.058
1041
+ - task:
1042
+ type: Retrieval
1043
+ dataset:
1044
+ type: BeIR/cqadupstack
1045
+ name: MTEB CQADupstackWordpressRetrieval
1046
+ config: default
1047
+ split: test
1048
+ revision: None
1049
+ metrics:
1050
+ - type: map_at_1
1051
+ value: 20.776
1052
+ - type: map_at_10
1053
+ value: 27.285999999999998
1054
+ - type: map_at_100
1055
+ value: 28.235
1056
+ - type: map_at_1000
1057
+ value: 28.337
1058
+ - type: map_at_3
1059
+ value: 25.147000000000002
1060
+ - type: map_at_5
1061
+ value: 26.401999999999997
1062
+ - type: mrr_at_1
1063
+ value: 22.921
1064
+ - type: mrr_at_10
1065
+ value: 29.409999999999997
1066
+ - type: mrr_at_100
1067
+ value: 30.275000000000002
1068
+ - type: mrr_at_1000
1069
+ value: 30.354999999999997
1070
+ - type: mrr_at_3
1071
+ value: 27.418
1072
+ - type: mrr_at_5
1073
+ value: 28.592000000000002
1074
+ - type: ndcg_at_1
1075
+ value: 22.921
1076
+ - type: ndcg_at_10
1077
+ value: 31.239
1078
+ - type: ndcg_at_100
1079
+ value: 35.965
1080
+ - type: ndcg_at_1000
1081
+ value: 38.602
1082
+ - type: ndcg_at_3
1083
+ value: 27.174
1084
+ - type: ndcg_at_5
1085
+ value: 29.229
1086
+ - type: precision_at_1
1087
+ value: 22.921
1088
+ - type: precision_at_10
1089
+ value: 4.806
1090
+ - type: precision_at_100
1091
+ value: 0.776
1092
+ - type: precision_at_1000
1093
+ value: 0.11
1094
+ - type: precision_at_3
1095
+ value: 11.459999999999999
1096
+ - type: precision_at_5
1097
+ value: 8.022
1098
+ - type: recall_at_1
1099
+ value: 20.776
1100
+ - type: recall_at_10
1101
+ value: 41.294
1102
+ - type: recall_at_100
1103
+ value: 63.111
1104
+ - type: recall_at_1000
1105
+ value: 82.88600000000001
1106
+ - type: recall_at_3
1107
+ value: 30.403000000000002
1108
+ - type: recall_at_5
1109
+ value: 35.455999999999996
1110
+ - task:
1111
+ type: Retrieval
1112
+ dataset:
1113
+ type: climate-fever
1114
+ name: MTEB ClimateFEVER
1115
+ config: default
1116
+ split: test
1117
+ revision: None
1118
+ metrics:
1119
+ - type: map_at_1
1120
+ value: 9.376
1121
+ - type: map_at_10
1122
+ value: 15.926000000000002
1123
+ - type: map_at_100
1124
+ value: 17.585
1125
+ - type: map_at_1000
1126
+ value: 17.776
1127
+ - type: map_at_3
1128
+ value: 13.014000000000001
1129
+ - type: map_at_5
1130
+ value: 14.417
1131
+ - type: mrr_at_1
1132
+ value: 20.195
1133
+ - type: mrr_at_10
1134
+ value: 29.95
1135
+ - type: mrr_at_100
1136
+ value: 31.052000000000003
1137
+ - type: mrr_at_1000
1138
+ value: 31.108000000000004
1139
+ - type: mrr_at_3
1140
+ value: 26.667
1141
+ - type: mrr_at_5
1142
+ value: 28.458
1143
+ - type: ndcg_at_1
1144
+ value: 20.195
1145
+ - type: ndcg_at_10
1146
+ value: 22.871
1147
+ - type: ndcg_at_100
1148
+ value: 29.921999999999997
1149
+ - type: ndcg_at_1000
1150
+ value: 33.672999999999995
1151
+ - type: ndcg_at_3
1152
+ value: 17.782999999999998
1153
+ - type: ndcg_at_5
1154
+ value: 19.544
1155
+ - type: precision_at_1
1156
+ value: 20.195
1157
+ - type: precision_at_10
1158
+ value: 7.394
1159
+ - type: precision_at_100
1160
+ value: 1.493
1161
+ - type: precision_at_1000
1162
+ value: 0.218
1163
+ - type: precision_at_3
1164
+ value: 13.073
1165
+ - type: precision_at_5
1166
+ value: 10.436
1167
+ - type: recall_at_1
1168
+ value: 9.376
1169
+ - type: recall_at_10
1170
+ value: 28.544999999999998
1171
+ - type: recall_at_100
1172
+ value: 53.147999999999996
1173
+ - type: recall_at_1000
1174
+ value: 74.62
1175
+ - type: recall_at_3
1176
+ value: 16.464000000000002
1177
+ - type: recall_at_5
1178
+ value: 21.004
1179
+ - task:
1180
+ type: Retrieval
1181
+ dataset:
1182
+ type: dbpedia-entity
1183
+ name: MTEB DBPedia
1184
+ config: default
1185
+ split: test
1186
+ revision: None
1187
+ metrics:
1188
+ - type: map_at_1
1189
+ value: 8.415000000000001
1190
+ - type: map_at_10
1191
+ value: 18.738
1192
+ - type: map_at_100
1193
+ value: 27.291999999999998
1194
+ - type: map_at_1000
1195
+ value: 28.992
1196
+ - type: map_at_3
1197
+ value: 13.196
1198
+ - type: map_at_5
1199
+ value: 15.539
1200
+ - type: mrr_at_1
1201
+ value: 66.5
1202
+ - type: mrr_at_10
1203
+ value: 74.518
1204
+ - type: mrr_at_100
1205
+ value: 74.86
1206
+ - type: mrr_at_1000
1207
+ value: 74.87
1208
+ - type: mrr_at_3
1209
+ value: 72.375
1210
+ - type: mrr_at_5
1211
+ value: 73.86200000000001
1212
+ - type: ndcg_at_1
1213
+ value: 54.37499999999999
1214
+ - type: ndcg_at_10
1215
+ value: 41.317
1216
+ - type: ndcg_at_100
1217
+ value: 45.845
1218
+ - type: ndcg_at_1000
1219
+ value: 52.92
1220
+ - type: ndcg_at_3
1221
+ value: 44.983000000000004
1222
+ - type: ndcg_at_5
1223
+ value: 42.989
1224
+ - type: precision_at_1
1225
+ value: 66.5
1226
+ - type: precision_at_10
1227
+ value: 33.6
1228
+ - type: precision_at_100
1229
+ value: 10.972999999999999
1230
+ - type: precision_at_1000
1231
+ value: 2.214
1232
+ - type: precision_at_3
1233
+ value: 48.583
1234
+ - type: precision_at_5
1235
+ value: 42.15
1236
+ - type: recall_at_1
1237
+ value: 8.415000000000001
1238
+ - type: recall_at_10
1239
+ value: 24.953
1240
+ - type: recall_at_100
1241
+ value: 52.48199999999999
1242
+ - type: recall_at_1000
1243
+ value: 75.093
1244
+ - type: recall_at_3
1245
+ value: 14.341000000000001
1246
+ - type: recall_at_5
1247
+ value: 18.468
1248
+ - task:
1249
+ type: Classification
1250
+ dataset:
1251
+ type: mteb/emotion
1252
+ name: MTEB EmotionClassification
1253
+ config: default
1254
+ split: test
1255
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
1256
+ metrics:
1257
+ - type: accuracy
1258
+ value: 47.06499999999999
1259
+ - type: f1
1260
+ value: 41.439327599975385
1261
+ - task:
1262
+ type: Retrieval
1263
+ dataset:
1264
+ type: fever
1265
+ name: MTEB FEVER
1266
+ config: default
1267
+ split: test
1268
+ revision: None
1269
+ metrics:
1270
+ - type: map_at_1
1271
+ value: 66.02
1272
+ - type: map_at_10
1273
+ value: 76.68599999999999
1274
+ - type: map_at_100
1275
+ value: 76.959
1276
+ - type: map_at_1000
1277
+ value: 76.972
1278
+ - type: map_at_3
1279
+ value: 75.024
1280
+ - type: map_at_5
1281
+ value: 76.153
1282
+ - type: mrr_at_1
1283
+ value: 71.197
1284
+ - type: mrr_at_10
1285
+ value: 81.105
1286
+ - type: mrr_at_100
1287
+ value: 81.232
1288
+ - type: mrr_at_1000
1289
+ value: 81.233
1290
+ - type: mrr_at_3
1291
+ value: 79.758
1292
+ - type: mrr_at_5
1293
+ value: 80.69
1294
+ - type: ndcg_at_1
1295
+ value: 71.197
1296
+ - type: ndcg_at_10
1297
+ value: 81.644
1298
+ - type: ndcg_at_100
1299
+ value: 82.645
1300
+ - type: ndcg_at_1000
1301
+ value: 82.879
1302
+ - type: ndcg_at_3
1303
+ value: 78.792
1304
+ - type: ndcg_at_5
1305
+ value: 80.528
1306
+ - type: precision_at_1
1307
+ value: 71.197
1308
+ - type: precision_at_10
1309
+ value: 10.206999999999999
1310
+ - type: precision_at_100
1311
+ value: 1.093
1312
+ - type: precision_at_1000
1313
+ value: 0.11299999999999999
1314
+ - type: precision_at_3
1315
+ value: 30.868000000000002
1316
+ - type: precision_at_5
1317
+ value: 19.559
1318
+ - type: recall_at_1
1319
+ value: 66.02
1320
+ - type: recall_at_10
1321
+ value: 92.50699999999999
1322
+ - type: recall_at_100
1323
+ value: 96.497
1324
+ - type: recall_at_1000
1325
+ value: 97.956
1326
+ - type: recall_at_3
1327
+ value: 84.866
1328
+ - type: recall_at_5
1329
+ value: 89.16199999999999
1330
+ - task:
1331
+ type: Retrieval
1332
+ dataset:
1333
+ type: fiqa
1334
+ name: MTEB FiQA2018
1335
+ config: default
1336
+ split: test
1337
+ revision: None
1338
+ metrics:
1339
+ - type: map_at_1
1340
+ value: 17.948
1341
+ - type: map_at_10
1342
+ value: 29.833
1343
+ - type: map_at_100
1344
+ value: 31.487
1345
+ - type: map_at_1000
1346
+ value: 31.674000000000003
1347
+ - type: map_at_3
1348
+ value: 26.029999999999998
1349
+ - type: map_at_5
1350
+ value: 28.038999999999998
1351
+ - type: mrr_at_1
1352
+ value: 34.721999999999994
1353
+ - type: mrr_at_10
1354
+ value: 44.214999999999996
1355
+ - type: mrr_at_100
1356
+ value: 44.994
1357
+ - type: mrr_at_1000
1358
+ value: 45.051
1359
+ - type: mrr_at_3
1360
+ value: 41.667
1361
+ - type: mrr_at_5
1362
+ value: 43.032
1363
+ - type: ndcg_at_1
1364
+ value: 34.721999999999994
1365
+ - type: ndcg_at_10
1366
+ value: 37.434
1367
+ - type: ndcg_at_100
1368
+ value: 43.702000000000005
1369
+ - type: ndcg_at_1000
1370
+ value: 46.993
1371
+ - type: ndcg_at_3
1372
+ value: 33.56
1373
+ - type: ndcg_at_5
1374
+ value: 34.687
1375
+ - type: precision_at_1
1376
+ value: 34.721999999999994
1377
+ - type: precision_at_10
1378
+ value: 10.401
1379
+ - type: precision_at_100
1380
+ value: 1.7049999999999998
1381
+ - type: precision_at_1000
1382
+ value: 0.22799999999999998
1383
+ - type: precision_at_3
1384
+ value: 22.531000000000002
1385
+ - type: precision_at_5
1386
+ value: 16.42
1387
+ - type: recall_at_1
1388
+ value: 17.948
1389
+ - type: recall_at_10
1390
+ value: 45.062999999999995
1391
+ - type: recall_at_100
1392
+ value: 68.191
1393
+ - type: recall_at_1000
1394
+ value: 87.954
1395
+ - type: recall_at_3
1396
+ value: 31.112000000000002
1397
+ - type: recall_at_5
1398
+ value: 36.823
1399
+ - task:
1400
+ type: Retrieval
1401
+ dataset:
1402
+ type: hotpotqa
1403
+ name: MTEB HotpotQA
1404
+ config: default
1405
+ split: test
1406
+ revision: None
1407
+ metrics:
1408
+ - type: map_at_1
1409
+ value: 36.644
1410
+ - type: map_at_10
1411
+ value: 57.658
1412
+ - type: map_at_100
1413
+ value: 58.562000000000005
1414
+ - type: map_at_1000
1415
+ value: 58.62500000000001
1416
+ - type: map_at_3
1417
+ value: 54.022999999999996
1418
+ - type: map_at_5
1419
+ value: 56.293000000000006
1420
+ - type: mrr_at_1
1421
+ value: 73.288
1422
+ - type: mrr_at_10
1423
+ value: 80.51700000000001
1424
+ - type: mrr_at_100
1425
+ value: 80.72
1426
+ - type: mrr_at_1000
1427
+ value: 80.728
1428
+ - type: mrr_at_3
1429
+ value: 79.33200000000001
1430
+ - type: mrr_at_5
1431
+ value: 80.085
1432
+ - type: ndcg_at_1
1433
+ value: 73.288
1434
+ - type: ndcg_at_10
1435
+ value: 66.61
1436
+ - type: ndcg_at_100
1437
+ value: 69.723
1438
+ - type: ndcg_at_1000
1439
+ value: 70.96000000000001
1440
+ - type: ndcg_at_3
1441
+ value: 61.358999999999995
1442
+ - type: ndcg_at_5
1443
+ value: 64.277
1444
+ - type: precision_at_1
1445
+ value: 73.288
1446
+ - type: precision_at_10
1447
+ value: 14.17
1448
+ - type: precision_at_100
1449
+ value: 1.659
1450
+ - type: precision_at_1000
1451
+ value: 0.182
1452
+ - type: precision_at_3
1453
+ value: 39.487
1454
+ - type: precision_at_5
1455
+ value: 25.999
1456
+ - type: recall_at_1
1457
+ value: 36.644
1458
+ - type: recall_at_10
1459
+ value: 70.851
1460
+ - type: recall_at_100
1461
+ value: 82.94399999999999
1462
+ - type: recall_at_1000
1463
+ value: 91.134
1464
+ - type: recall_at_3
1465
+ value: 59.230000000000004
1466
+ - type: recall_at_5
1467
+ value: 64.997
1468
+ - task:
1469
+ type: Classification
1470
+ dataset:
1471
+ type: mteb/imdb
1472
+ name: MTEB ImdbClassification
1473
+ config: default
1474
+ split: test
1475
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1476
+ metrics:
1477
+ - type: accuracy
1478
+ value: 86.00280000000001
1479
+ - type: ap
1480
+ value: 80.46302061021223
1481
+ - type: f1
1482
+ value: 85.9592921596419
1483
+ - task:
1484
+ type: Retrieval
1485
+ dataset:
1486
+ type: msmarco
1487
+ name: MTEB MSMARCO
1488
+ config: default
1489
+ split: dev
1490
+ revision: None
1491
+ metrics:
1492
+ - type: map_at_1
1493
+ value: 22.541
1494
+ - type: map_at_10
1495
+ value: 34.625
1496
+ - type: map_at_100
1497
+ value: 35.785
1498
+ - type: map_at_1000
1499
+ value: 35.831
1500
+ - type: map_at_3
1501
+ value: 30.823
1502
+ - type: map_at_5
1503
+ value: 32.967999999999996
1504
+ - type: mrr_at_1
1505
+ value: 23.180999999999997
1506
+ - type: mrr_at_10
1507
+ value: 35.207
1508
+ - type: mrr_at_100
1509
+ value: 36.315
1510
+ - type: mrr_at_1000
1511
+ value: 36.355
1512
+ - type: mrr_at_3
1513
+ value: 31.483
1514
+ - type: mrr_at_5
1515
+ value: 33.589999999999996
1516
+ - type: ndcg_at_1
1517
+ value: 23.195
1518
+ - type: ndcg_at_10
1519
+ value: 41.461
1520
+ - type: ndcg_at_100
1521
+ value: 47.032000000000004
1522
+ - type: ndcg_at_1000
1523
+ value: 48.199999999999996
1524
+ - type: ndcg_at_3
1525
+ value: 33.702
1526
+ - type: ndcg_at_5
1527
+ value: 37.522
1528
+ - type: precision_at_1
1529
+ value: 23.195
1530
+ - type: precision_at_10
1531
+ value: 6.526999999999999
1532
+ - type: precision_at_100
1533
+ value: 0.932
1534
+ - type: precision_at_1000
1535
+ value: 0.10300000000000001
1536
+ - type: precision_at_3
1537
+ value: 14.308000000000002
1538
+ - type: precision_at_5
1539
+ value: 10.507
1540
+ - type: recall_at_1
1541
+ value: 22.541
1542
+ - type: recall_at_10
1543
+ value: 62.524
1544
+ - type: recall_at_100
1545
+ value: 88.228
1546
+ - type: recall_at_1000
1547
+ value: 97.243
1548
+ - type: recall_at_3
1549
+ value: 41.38
1550
+ - type: recall_at_5
1551
+ value: 50.55
1552
+ - task:
1553
+ type: Classification
1554
+ dataset:
1555
+ type: mteb/mtop_domain
1556
+ name: MTEB MTOPDomainClassification (en)
1557
+ config: en
1558
+ split: test
1559
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1560
+ metrics:
1561
+ - type: accuracy
1562
+ value: 92.69949840401279
1563
+ - type: f1
1564
+ value: 92.54141471311786
1565
+ - task:
1566
+ type: Classification
1567
+ dataset:
1568
+ type: mteb/mtop_intent
1569
+ name: MTEB MTOPIntentClassification (en)
1570
+ config: en
1571
+ split: test
1572
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1573
+ metrics:
1574
+ - type: accuracy
1575
+ value: 72.56041951664386
1576
+ - type: f1
1577
+ value: 55.88499977508287
1578
+ - task:
1579
+ type: Classification
1580
+ dataset:
1581
+ type: mteb/amazon_massive_intent
1582
+ name: MTEB MassiveIntentClassification (en)
1583
+ config: en
1584
+ split: test
1585
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1586
+ metrics:
1587
+ - type: accuracy
1588
+ value: 71.62071284465365
1589
+ - type: f1
1590
+ value: 69.36717546572152
1591
+ - task:
1592
+ type: Classification
1593
+ dataset:
1594
+ type: mteb/amazon_massive_scenario
1595
+ name: MTEB MassiveScenarioClassification (en)
1596
+ config: en
1597
+ split: test
1598
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1599
+ metrics:
1600
+ - type: accuracy
1601
+ value: 76.35843981170142
1602
+ - type: f1
1603
+ value: 76.15496453538884
1604
+ - task:
1605
+ type: Clustering
1606
+ dataset:
1607
+ type: mteb/medrxiv-clustering-p2p
1608
+ name: MTEB MedrxivClusteringP2P
1609
+ config: default
1610
+ split: test
1611
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1612
+ metrics:
1613
+ - type: v_measure
1614
+ value: 31.33664956793118
1615
+ - task:
1616
+ type: Clustering
1617
+ dataset:
1618
+ type: mteb/medrxiv-clustering-s2s
1619
+ name: MTEB MedrxivClusteringS2S
1620
+ config: default
1621
+ split: test
1622
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1623
+ metrics:
1624
+ - type: v_measure
1625
+ value: 27.883839621715524
1626
+ - task:
1627
+ type: Reranking
1628
+ dataset:
1629
+ type: mteb/mind_small
1630
+ name: MTEB MindSmallReranking
1631
+ config: default
1632
+ split: test
1633
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1634
+ metrics:
1635
+ - type: map
1636
+ value: 30.096874986740758
1637
+ - type: mrr
1638
+ value: 30.97300481932132
1639
+ - task:
1640
+ type: Retrieval
1641
+ dataset:
1642
+ type: nfcorpus
1643
+ name: MTEB NFCorpus
1644
+ config: default
1645
+ split: test
1646
+ revision: None
1647
+ metrics:
1648
+ - type: map_at_1
1649
+ value: 5.4
1650
+ - type: map_at_10
1651
+ value: 11.852
1652
+ - type: map_at_100
1653
+ value: 14.758
1654
+ - type: map_at_1000
1655
+ value: 16.134
1656
+ - type: map_at_3
1657
+ value: 8.558
1658
+ - type: map_at_5
1659
+ value: 10.087
1660
+ - type: mrr_at_1
1661
+ value: 44.272
1662
+ - type: mrr_at_10
1663
+ value: 52.05800000000001
1664
+ - type: mrr_at_100
1665
+ value: 52.689
1666
+ - type: mrr_at_1000
1667
+ value: 52.742999999999995
1668
+ - type: mrr_at_3
1669
+ value: 50.205999999999996
1670
+ - type: mrr_at_5
1671
+ value: 51.367
1672
+ - type: ndcg_at_1
1673
+ value: 42.57
1674
+ - type: ndcg_at_10
1675
+ value: 32.449
1676
+ - type: ndcg_at_100
1677
+ value: 29.596
1678
+ - type: ndcg_at_1000
1679
+ value: 38.351
1680
+ - type: ndcg_at_3
1681
+ value: 37.044
1682
+ - type: ndcg_at_5
1683
+ value: 35.275
1684
+ - type: precision_at_1
1685
+ value: 44.272
1686
+ - type: precision_at_10
1687
+ value: 23.87
1688
+ - type: precision_at_100
1689
+ value: 7.625
1690
+ - type: precision_at_1000
1691
+ value: 2.045
1692
+ - type: precision_at_3
1693
+ value: 34.365
1694
+ - type: precision_at_5
1695
+ value: 30.341
1696
+ - type: recall_at_1
1697
+ value: 5.4
1698
+ - type: recall_at_10
1699
+ value: 15.943999999999999
1700
+ - type: recall_at_100
1701
+ value: 29.805
1702
+ - type: recall_at_1000
1703
+ value: 61.695
1704
+ - type: recall_at_3
1705
+ value: 9.539
1706
+ - type: recall_at_5
1707
+ value: 12.127
1708
+ - task:
1709
+ type: Retrieval
1710
+ dataset:
1711
+ type: nq
1712
+ name: MTEB NQ
1713
+ config: default
1714
+ split: test
1715
+ revision: None
1716
+ metrics:
1717
+ - type: map_at_1
1718
+ value: 36.047000000000004
1719
+ - type: map_at_10
1720
+ value: 51.6
1721
+ - type: map_at_100
1722
+ value: 52.449999999999996
1723
+ - type: map_at_1000
1724
+ value: 52.476
1725
+ - type: map_at_3
1726
+ value: 47.452
1727
+ - type: map_at_5
1728
+ value: 49.964
1729
+ - type: mrr_at_1
1730
+ value: 40.382
1731
+ - type: mrr_at_10
1732
+ value: 54.273
1733
+ - type: mrr_at_100
1734
+ value: 54.859
1735
+ - type: mrr_at_1000
1736
+ value: 54.876000000000005
1737
+ - type: mrr_at_3
1738
+ value: 51.014
1739
+ - type: mrr_at_5
1740
+ value: 52.983999999999995
1741
+ - type: ndcg_at_1
1742
+ value: 40.353
1743
+ - type: ndcg_at_10
1744
+ value: 59.11300000000001
1745
+ - type: ndcg_at_100
1746
+ value: 62.604000000000006
1747
+ - type: ndcg_at_1000
1748
+ value: 63.187000000000005
1749
+ - type: ndcg_at_3
1750
+ value: 51.513
1751
+ - type: ndcg_at_5
1752
+ value: 55.576
1753
+ - type: precision_at_1
1754
+ value: 40.353
1755
+ - type: precision_at_10
1756
+ value: 9.418
1757
+ - type: precision_at_100
1758
+ value: 1.1440000000000001
1759
+ - type: precision_at_1000
1760
+ value: 0.12
1761
+ - type: precision_at_3
1762
+ value: 23.078000000000003
1763
+ - type: precision_at_5
1764
+ value: 16.250999999999998
1765
+ - type: recall_at_1
1766
+ value: 36.047000000000004
1767
+ - type: recall_at_10
1768
+ value: 79.22200000000001
1769
+ - type: recall_at_100
1770
+ value: 94.23
1771
+ - type: recall_at_1000
1772
+ value: 98.51100000000001
1773
+ - type: recall_at_3
1774
+ value: 59.678
1775
+ - type: recall_at_5
1776
+ value: 68.967
1777
+ - task:
1778
+ type: Retrieval
1779
+ dataset:
1780
+ type: quora
1781
+ name: MTEB QuoraRetrieval
1782
+ config: default
1783
+ split: test
1784
+ revision: None
1785
+ metrics:
1786
+ - type: map_at_1
1787
+ value: 68.232
1788
+ - type: map_at_10
1789
+ value: 81.674
1790
+ - type: map_at_100
1791
+ value: 82.338
1792
+ - type: map_at_1000
1793
+ value: 82.36099999999999
1794
+ - type: map_at_3
1795
+ value: 78.833
1796
+ - type: map_at_5
1797
+ value: 80.58
1798
+ - type: mrr_at_1
1799
+ value: 78.64
1800
+ - type: mrr_at_10
1801
+ value: 85.164
1802
+ - type: mrr_at_100
1803
+ value: 85.317
1804
+ - type: mrr_at_1000
1805
+ value: 85.319
1806
+ - type: mrr_at_3
1807
+ value: 84.127
1808
+ - type: mrr_at_5
1809
+ value: 84.789
1810
+ - type: ndcg_at_1
1811
+ value: 78.63
1812
+ - type: ndcg_at_10
1813
+ value: 85.711
1814
+ - type: ndcg_at_100
1815
+ value: 87.238
1816
+ - type: ndcg_at_1000
1817
+ value: 87.444
1818
+ - type: ndcg_at_3
1819
+ value: 82.788
1820
+ - type: ndcg_at_5
1821
+ value: 84.313
1822
+ - type: precision_at_1
1823
+ value: 78.63
1824
+ - type: precision_at_10
1825
+ value: 12.977
1826
+ - type: precision_at_100
1827
+ value: 1.503
1828
+ - type: precision_at_1000
1829
+ value: 0.156
1830
+ - type: precision_at_3
1831
+ value: 36.113
1832
+ - type: precision_at_5
1833
+ value: 23.71
1834
+ - type: recall_at_1
1835
+ value: 68.232
1836
+ - type: recall_at_10
1837
+ value: 93.30199999999999
1838
+ - type: recall_at_100
1839
+ value: 98.799
1840
+ - type: recall_at_1000
1841
+ value: 99.885
1842
+ - type: recall_at_3
1843
+ value: 84.827
1844
+ - type: recall_at_5
1845
+ value: 89.188
1846
+ - task:
1847
+ type: Clustering
1848
+ dataset:
1849
+ type: mteb/reddit-clustering
1850
+ name: MTEB RedditClustering
1851
+ config: default
1852
+ split: test
1853
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
1854
+ metrics:
1855
+ - type: v_measure
1856
+ value: 45.71879170816294
1857
+ - task:
1858
+ type: Clustering
1859
+ dataset:
1860
+ type: mteb/reddit-clustering-p2p
1861
+ name: MTEB RedditClusteringP2P
1862
+ config: default
1863
+ split: test
1864
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
1865
+ metrics:
1866
+ - type: v_measure
1867
+ value: 59.65866311751794
1868
+ - task:
1869
+ type: Retrieval
1870
+ dataset:
1871
+ type: scidocs
1872
+ name: MTEB SCIDOCS
1873
+ config: default
1874
+ split: test
1875
+ revision: None
1876
+ metrics:
1877
+ - type: map_at_1
1878
+ value: 4.218
1879
+ - type: map_at_10
1880
+ value: 10.337
1881
+ - type: map_at_100
1882
+ value: 12.131
1883
+ - type: map_at_1000
1884
+ value: 12.411
1885
+ - type: map_at_3
1886
+ value: 7.4270000000000005
1887
+ - type: map_at_5
1888
+ value: 8.913
1889
+ - type: mrr_at_1
1890
+ value: 20.8
1891
+ - type: mrr_at_10
1892
+ value: 30.868000000000002
1893
+ - type: mrr_at_100
1894
+ value: 31.903
1895
+ - type: mrr_at_1000
1896
+ value: 31.972
1897
+ - type: mrr_at_3
1898
+ value: 27.367
1899
+ - type: mrr_at_5
1900
+ value: 29.372
1901
+ - type: ndcg_at_1
1902
+ value: 20.8
1903
+ - type: ndcg_at_10
1904
+ value: 17.765
1905
+ - type: ndcg_at_100
1906
+ value: 24.914
1907
+ - type: ndcg_at_1000
1908
+ value: 30.206
1909
+ - type: ndcg_at_3
1910
+ value: 16.64
1911
+ - type: ndcg_at_5
1912
+ value: 14.712
1913
+ - type: precision_at_1
1914
+ value: 20.8
1915
+ - type: precision_at_10
1916
+ value: 9.24
1917
+ - type: precision_at_100
1918
+ value: 1.9560000000000002
1919
+ - type: precision_at_1000
1920
+ value: 0.32299999999999995
1921
+ - type: precision_at_3
1922
+ value: 15.467
1923
+ - type: precision_at_5
1924
+ value: 12.94
1925
+ - type: recall_at_1
1926
+ value: 4.218
1927
+ - type: recall_at_10
1928
+ value: 18.752
1929
+ - type: recall_at_100
1930
+ value: 39.7
1931
+ - type: recall_at_1000
1932
+ value: 65.57300000000001
1933
+ - type: recall_at_3
1934
+ value: 9.428
1935
+ - type: recall_at_5
1936
+ value: 13.133000000000001
1937
+ - task:
1938
+ type: STS
1939
+ dataset:
1940
+ type: mteb/sickr-sts
1941
+ name: MTEB SICK-R
1942
+ config: default
1943
+ split: test
1944
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1945
+ metrics:
1946
+ - type: cos_sim_pearson
1947
+ value: 83.04338850207233
1948
+ - type: cos_sim_spearman
1949
+ value: 78.5054651430423
1950
+ - type: euclidean_pearson
1951
+ value: 80.30739451228612
1952
+ - type: euclidean_spearman
1953
+ value: 78.48377464299097
1954
+ - type: manhattan_pearson
1955
+ value: 80.40795049052781
1956
+ - type: manhattan_spearman
1957
+ value: 78.49506205443114
1958
+ - task:
1959
+ type: STS
1960
+ dataset:
1961
+ type: mteb/sts12-sts
1962
+ name: MTEB STS12
1963
+ config: default
1964
+ split: test
1965
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1966
+ metrics:
1967
+ - type: cos_sim_pearson
1968
+ value: 84.11596224442962
1969
+ - type: cos_sim_spearman
1970
+ value: 76.20997388935461
1971
+ - type: euclidean_pearson
1972
+ value: 80.56858451349109
1973
+ - type: euclidean_spearman
1974
+ value: 75.92659183871186
1975
+ - type: manhattan_pearson
1976
+ value: 80.60246102203844
1977
+ - type: manhattan_spearman
1978
+ value: 76.03018971432664
1979
+ - task:
1980
+ type: STS
1981
+ dataset:
1982
+ type: mteb/sts13-sts
1983
+ name: MTEB STS13
1984
+ config: default
1985
+ split: test
1986
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1987
+ metrics:
1988
+ - type: cos_sim_pearson
1989
+ value: 81.34691640755737
1990
+ - type: cos_sim_spearman
1991
+ value: 82.4018369631579
1992
+ - type: euclidean_pearson
1993
+ value: 81.87673092245366
1994
+ - type: euclidean_spearman
1995
+ value: 82.3671489960678
1996
+ - type: manhattan_pearson
1997
+ value: 81.88222387719948
1998
+ - type: manhattan_spearman
1999
+ value: 82.3816590344736
2000
+ - task:
2001
+ type: STS
2002
+ dataset:
2003
+ type: mteb/sts14-sts
2004
+ name: MTEB STS14
2005
+ config: default
2006
+ split: test
2007
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
2008
+ metrics:
2009
+ - type: cos_sim_pearson
2010
+ value: 81.2836092579524
2011
+ - type: cos_sim_spearman
2012
+ value: 78.99982781772064
2013
+ - type: euclidean_pearson
2014
+ value: 80.5184271010527
2015
+ - type: euclidean_spearman
2016
+ value: 78.89777392101904
2017
+ - type: manhattan_pearson
2018
+ value: 80.53585705018664
2019
+ - type: manhattan_spearman
2020
+ value: 78.92898405472994
2021
+ - task:
2022
+ type: STS
2023
+ dataset:
2024
+ type: mteb/sts15-sts
2025
+ name: MTEB STS15
2026
+ config: default
2027
+ split: test
2028
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
2029
+ metrics:
2030
+ - type: cos_sim_pearson
2031
+ value: 86.7349907750784
2032
+ - type: cos_sim_spearman
2033
+ value: 87.7611234446225
2034
+ - type: euclidean_pearson
2035
+ value: 86.98759326731624
2036
+ - type: euclidean_spearman
2037
+ value: 87.58321319424618
2038
+ - type: manhattan_pearson
2039
+ value: 87.03483090370842
2040
+ - type: manhattan_spearman
2041
+ value: 87.63278333060288
2042
+ - task:
2043
+ type: STS
2044
+ dataset:
2045
+ type: mteb/sts16-sts
2046
+ name: MTEB STS16
2047
+ config: default
2048
+ split: test
2049
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
2050
+ metrics:
2051
+ - type: cos_sim_pearson
2052
+ value: 81.75873694924825
2053
+ - type: cos_sim_spearman
2054
+ value: 83.80237999094724
2055
+ - type: euclidean_pearson
2056
+ value: 83.55023725861537
2057
+ - type: euclidean_spearman
2058
+ value: 84.12744338577744
2059
+ - type: manhattan_pearson
2060
+ value: 83.58816983036232
2061
+ - type: manhattan_spearman
2062
+ value: 84.18520748676501
2063
+ - task:
2064
+ type: STS
2065
+ dataset:
2066
+ type: mteb/sts17-crosslingual-sts
2067
+ name: MTEB STS17 (en-en)
2068
+ config: en-en
2069
+ split: test
2070
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
2071
+ metrics:
2072
+ - type: cos_sim_pearson
2073
+ value: 87.21630882940174
2074
+ - type: cos_sim_spearman
2075
+ value: 87.72382883437031
2076
+ - type: euclidean_pearson
2077
+ value: 88.69933350930333
2078
+ - type: euclidean_spearman
2079
+ value: 88.24660814383081
2080
+ - type: manhattan_pearson
2081
+ value: 88.77331018833499
2082
+ - type: manhattan_spearman
2083
+ value: 88.26109989380632
2084
+ - task:
2085
+ type: STS
2086
+ dataset:
2087
+ type: mteb/sts22-crosslingual-sts
2088
+ name: MTEB STS22 (en)
2089
+ config: en
2090
+ split: test
2091
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
2092
+ metrics:
2093
+ - type: cos_sim_pearson
2094
+ value: 61.11854063060489
2095
+ - type: cos_sim_spearman
2096
+ value: 63.14678634195072
2097
+ - type: euclidean_pearson
2098
+ value: 61.679090067000864
2099
+ - type: euclidean_spearman
2100
+ value: 62.28876589509653
2101
+ - type: manhattan_pearson
2102
+ value: 62.082324165511004
2103
+ - type: manhattan_spearman
2104
+ value: 62.56030932816679
2105
+ - task:
2106
+ type: STS
2107
+ dataset:
2108
+ type: mteb/stsbenchmark-sts
2109
+ name: MTEB STSBenchmark
2110
+ config: default
2111
+ split: test
2112
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
2113
+ metrics:
2114
+ - type: cos_sim_pearson
2115
+ value: 84.00319882832645
2116
+ - type: cos_sim_spearman
2117
+ value: 85.94529772647257
2118
+ - type: euclidean_pearson
2119
+ value: 85.6661390122756
2120
+ - type: euclidean_spearman
2121
+ value: 85.97747815545827
2122
+ - type: manhattan_pearson
2123
+ value: 85.58422770541893
2124
+ - type: manhattan_spearman
2125
+ value: 85.9237139181532
2126
+ - task:
2127
+ type: Reranking
2128
+ dataset:
2129
+ type: mteb/scidocs-reranking
2130
+ name: MTEB SciDocsRR
2131
+ config: default
2132
+ split: test
2133
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
2134
+ metrics:
2135
+ - type: map
2136
+ value: 79.16198731863916
2137
+ - type: mrr
2138
+ value: 94.25202702163487
2139
+ - task:
2140
+ type: Retrieval
2141
+ dataset:
2142
+ type: scifact
2143
+ name: MTEB SciFact
2144
+ config: default
2145
+ split: test
2146
+ revision: None
2147
+ metrics:
2148
+ - type: map_at_1
2149
+ value: 54.761
2150
+ - type: map_at_10
2151
+ value: 64.396
2152
+ - type: map_at_100
2153
+ value: 65.07
2154
+ - type: map_at_1000
2155
+ value: 65.09899999999999
2156
+ - type: map_at_3
2157
+ value: 61.846000000000004
2158
+ - type: map_at_5
2159
+ value: 63.284
2160
+ - type: mrr_at_1
2161
+ value: 57.667
2162
+ - type: mrr_at_10
2163
+ value: 65.83099999999999
2164
+ - type: mrr_at_100
2165
+ value: 66.36800000000001
2166
+ - type: mrr_at_1000
2167
+ value: 66.39399999999999
2168
+ - type: mrr_at_3
2169
+ value: 64.056
2170
+ - type: mrr_at_5
2171
+ value: 65.206
2172
+ - type: ndcg_at_1
2173
+ value: 57.667
2174
+ - type: ndcg_at_10
2175
+ value: 68.854
2176
+ - type: ndcg_at_100
2177
+ value: 71.59100000000001
2178
+ - type: ndcg_at_1000
2179
+ value: 72.383
2180
+ - type: ndcg_at_3
2181
+ value: 64.671
2182
+ - type: ndcg_at_5
2183
+ value: 66.796
2184
+ - type: precision_at_1
2185
+ value: 57.667
2186
+ - type: precision_at_10
2187
+ value: 9.167
2188
+ - type: precision_at_100
2189
+ value: 1.053
2190
+ - type: precision_at_1000
2191
+ value: 0.11199999999999999
2192
+ - type: precision_at_3
2193
+ value: 25.444
2194
+ - type: precision_at_5
2195
+ value: 16.667
2196
+ - type: recall_at_1
2197
+ value: 54.761
2198
+ - type: recall_at_10
2199
+ value: 80.9
2200
+ - type: recall_at_100
2201
+ value: 92.767
2202
+ - type: recall_at_1000
2203
+ value: 99
2204
+ - type: recall_at_3
2205
+ value: 69.672
2206
+ - type: recall_at_5
2207
+ value: 75.083
2208
+ - task:
2209
+ type: PairClassification
2210
+ dataset:
2211
+ type: mteb/sprintduplicatequestions-pairclassification
2212
+ name: MTEB SprintDuplicateQuestions
2213
+ config: default
2214
+ split: test
2215
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
2216
+ metrics:
2217
+ - type: cos_sim_accuracy
2218
+ value: 99.8079207920792
2219
+ - type: cos_sim_ap
2220
+ value: 94.88470927617445
2221
+ - type: cos_sim_f1
2222
+ value: 90.08179959100204
2223
+ - type: cos_sim_precision
2224
+ value: 92.15481171548117
2225
+ - type: cos_sim_recall
2226
+ value: 88.1
2227
+ - type: dot_accuracy
2228
+ value: 99.58613861386138
2229
+ - type: dot_ap
2230
+ value: 82.94822578881316
2231
+ - type: dot_f1
2232
+ value: 77.33333333333333
2233
+ - type: dot_precision
2234
+ value: 79.36842105263158
2235
+ - type: dot_recall
2236
+ value: 75.4
2237
+ - type: euclidean_accuracy
2238
+ value: 99.8069306930693
2239
+ - type: euclidean_ap
2240
+ value: 94.81367858031837
2241
+ - type: euclidean_f1
2242
+ value: 90.01009081735621
2243
+ - type: euclidean_precision
2244
+ value: 90.83503054989816
2245
+ - type: euclidean_recall
2246
+ value: 89.2
2247
+ - type: manhattan_accuracy
2248
+ value: 99.81188118811882
2249
+ - type: manhattan_ap
2250
+ value: 94.91405337220161
2251
+ - type: manhattan_f1
2252
+ value: 90.2763561924258
2253
+ - type: manhattan_precision
2254
+ value: 92.45283018867924
2255
+ - type: manhattan_recall
2256
+ value: 88.2
2257
+ - type: max_accuracy
2258
+ value: 99.81188118811882
2259
+ - type: max_ap
2260
+ value: 94.91405337220161
2261
+ - type: max_f1
2262
+ value: 90.2763561924258
2263
+ - task:
2264
+ type: Clustering
2265
+ dataset:
2266
+ type: mteb/stackexchange-clustering
2267
+ name: MTEB StackExchangeClustering
2268
+ config: default
2269
+ split: test
2270
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
2271
+ metrics:
2272
+ - type: v_measure
2273
+ value: 58.511599500053094
2274
+ - task:
2275
+ type: Clustering
2276
+ dataset:
2277
+ type: mteb/stackexchange-clustering-p2p
2278
+ name: MTEB StackExchangeClusteringP2P
2279
+ config: default
2280
+ split: test
2281
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
2282
+ metrics:
2283
+ - type: v_measure
2284
+ value: 31.984728147814707
2285
+ - task:
2286
+ type: Reranking
2287
+ dataset:
2288
+ type: mteb/stackoverflowdupquestions-reranking
2289
+ name: MTEB StackOverflowDupQuestions
2290
+ config: default
2291
+ split: test
2292
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
2293
+ metrics:
2294
+ - type: map
2295
+ value: 49.93428193939015
2296
+ - type: mrr
2297
+ value: 50.916557911043206
2298
+ - task:
2299
+ type: Summarization
2300
+ dataset:
2301
+ type: mteb/summeval
2302
+ name: MTEB SummEval
2303
+ config: default
2304
+ split: test
2305
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
2306
+ metrics:
2307
+ - type: cos_sim_pearson
2308
+ value: 31.562500894537145
2309
+ - type: cos_sim_spearman
2310
+ value: 31.162587976726307
2311
+ - type: dot_pearson
2312
+ value: 22.633662187735762
2313
+ - type: dot_spearman
2314
+ value: 22.723000282378962
2315
+ - task:
2316
+ type: Retrieval
2317
+ dataset:
2318
+ type: trec-covid
2319
+ name: MTEB TRECCOVID
2320
+ config: default
2321
+ split: test
2322
+ revision: None
2323
+ metrics:
2324
+ - type: map_at_1
2325
+ value: 0.219
2326
+ - type: map_at_10
2327
+ value: 1.871
2328
+ - type: map_at_100
2329
+ value: 10.487
2330
+ - type: map_at_1000
2331
+ value: 25.122
2332
+ - type: map_at_3
2333
+ value: 0.657
2334
+ - type: map_at_5
2335
+ value: 1.0699999999999998
2336
+ - type: mrr_at_1
2337
+ value: 84
2338
+ - type: mrr_at_10
2339
+ value: 89.567
2340
+ - type: mrr_at_100
2341
+ value: 89.748
2342
+ - type: mrr_at_1000
2343
+ value: 89.748
2344
+ - type: mrr_at_3
2345
+ value: 88.667
2346
+ - type: mrr_at_5
2347
+ value: 89.567
2348
+ - type: ndcg_at_1
2349
+ value: 80
2350
+ - type: ndcg_at_10
2351
+ value: 74.533
2352
+ - type: ndcg_at_100
2353
+ value: 55.839000000000006
2354
+ - type: ndcg_at_1000
2355
+ value: 49.748
2356
+ - type: ndcg_at_3
2357
+ value: 79.53099999999999
2358
+ - type: ndcg_at_5
2359
+ value: 78.245
2360
+ - type: precision_at_1
2361
+ value: 84
2362
+ - type: precision_at_10
2363
+ value: 78.4
2364
+ - type: precision_at_100
2365
+ value: 56.99999999999999
2366
+ - type: precision_at_1000
2367
+ value: 21.98
2368
+ - type: precision_at_3
2369
+ value: 85.333
2370
+ - type: precision_at_5
2371
+ value: 84.8
2372
+ - type: recall_at_1
2373
+ value: 0.219
2374
+ - type: recall_at_10
2375
+ value: 2.02
2376
+ - type: recall_at_100
2377
+ value: 13.555
2378
+ - type: recall_at_1000
2379
+ value: 46.739999999999995
2380
+ - type: recall_at_3
2381
+ value: 0.685
2382
+ - type: recall_at_5
2383
+ value: 1.13
2384
+ - task:
2385
+ type: Retrieval
2386
+ dataset:
2387
+ type: webis-touche2020
2388
+ name: MTEB Touche2020
2389
+ config: default
2390
+ split: test
2391
+ revision: None
2392
+ metrics:
2393
+ - type: map_at_1
2394
+ value: 3.5029999999999997
2395
+ - type: map_at_10
2396
+ value: 11.042
2397
+ - type: map_at_100
2398
+ value: 16.326999999999998
2399
+ - type: map_at_1000
2400
+ value: 17.836
2401
+ - type: map_at_3
2402
+ value: 6.174
2403
+ - type: map_at_5
2404
+ value: 7.979
2405
+ - type: mrr_at_1
2406
+ value: 42.857
2407
+ - type: mrr_at_10
2408
+ value: 52.617000000000004
2409
+ - type: mrr_at_100
2410
+ value: 53.351000000000006
2411
+ - type: mrr_at_1000
2412
+ value: 53.351000000000006
2413
+ - type: mrr_at_3
2414
+ value: 46.939
2415
+ - type: mrr_at_5
2416
+ value: 50.714000000000006
2417
+ - type: ndcg_at_1
2418
+ value: 38.775999999999996
2419
+ - type: ndcg_at_10
2420
+ value: 27.125
2421
+ - type: ndcg_at_100
2422
+ value: 35.845
2423
+ - type: ndcg_at_1000
2424
+ value: 47.377
2425
+ - type: ndcg_at_3
2426
+ value: 29.633
2427
+ - type: ndcg_at_5
2428
+ value: 28.378999999999998
2429
+ - type: precision_at_1
2430
+ value: 42.857
2431
+ - type: precision_at_10
2432
+ value: 24.082
2433
+ - type: precision_at_100
2434
+ value: 6.877999999999999
2435
+ - type: precision_at_1000
2436
+ value: 1.463
2437
+ - type: precision_at_3
2438
+ value: 29.932
2439
+ - type: precision_at_5
2440
+ value: 28.571
2441
+ - type: recall_at_1
2442
+ value: 3.5029999999999997
2443
+ - type: recall_at_10
2444
+ value: 17.068
2445
+ - type: recall_at_100
2446
+ value: 43.361
2447
+ - type: recall_at_1000
2448
+ value: 78.835
2449
+ - type: recall_at_3
2450
+ value: 6.821000000000001
2451
+ - type: recall_at_5
2452
+ value: 10.357
2453
+ - task:
2454
+ type: Classification
2455
+ dataset:
2456
+ type: mteb/toxic_conversations_50k
2457
+ name: MTEB ToxicConversationsClassification
2458
+ config: default
2459
+ split: test
2460
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
2461
+ metrics:
2462
+ - type: accuracy
2463
+ value: 71.0954
2464
+ - type: ap
2465
+ value: 14.216844153511959
2466
+ - type: f1
2467
+ value: 54.63687418565117
2468
+ - task:
2469
+ type: Classification
2470
+ dataset:
2471
+ type: mteb/tweet_sentiment_extraction
2472
+ name: MTEB TweetSentimentExtractionClassification
2473
+ config: default
2474
+ split: test
2475
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
2476
+ metrics:
2477
+ - type: accuracy
2478
+ value: 61.46293152235427
2479
+ - type: f1
2480
+ value: 61.744177921638645
2481
+ - task:
2482
+ type: Clustering
2483
+ dataset:
2484
+ type: mteb/twentynewsgroups-clustering
2485
+ name: MTEB TwentyNewsgroupsClustering
2486
+ config: default
2487
+ split: test
2488
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
2489
+ metrics:
2490
+ - type: v_measure
2491
+ value: 41.12708617788644
2492
+ - task:
2493
+ type: PairClassification
2494
+ dataset:
2495
+ type: mteb/twittersemeval2015-pairclassification
2496
+ name: MTEB TwitterSemEval2015
2497
+ config: default
2498
+ split: test
2499
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
2500
+ metrics:
2501
+ - type: cos_sim_accuracy
2502
+ value: 85.75430649102938
2503
+ - type: cos_sim_ap
2504
+ value: 73.34252536948081
2505
+ - type: cos_sim_f1
2506
+ value: 67.53758935173774
2507
+ - type: cos_sim_precision
2508
+ value: 63.3672525439408
2509
+ - type: cos_sim_recall
2510
+ value: 72.29551451187335
2511
+ - type: dot_accuracy
2512
+ value: 81.71305954580676
2513
+ - type: dot_ap
2514
+ value: 59.5532209082386
2515
+ - type: dot_f1
2516
+ value: 56.18466898954705
2517
+ - type: dot_precision
2518
+ value: 47.830923248053395
2519
+ - type: dot_recall
2520
+ value: 68.07387862796834
2521
+ - type: euclidean_accuracy
2522
+ value: 85.81987244441795
2523
+ - type: euclidean_ap
2524
+ value: 73.34325409809446
2525
+ - type: euclidean_f1
2526
+ value: 67.83451360417443
2527
+ - type: euclidean_precision
2528
+ value: 64.09955388588871
2529
+ - type: euclidean_recall
2530
+ value: 72.0316622691293
2531
+ - type: manhattan_accuracy
2532
+ value: 85.68277999642368
2533
+ - type: manhattan_ap
2534
+ value: 73.1535450121903
2535
+ - type: manhattan_f1
2536
+ value: 67.928237896289
2537
+ - type: manhattan_precision
2538
+ value: 63.56945722171113
2539
+ - type: manhattan_recall
2540
+ value: 72.9287598944591
2541
+ - type: max_accuracy
2542
+ value: 85.81987244441795
2543
+ - type: max_ap
2544
+ value: 73.34325409809446
2545
+ - type: max_f1
2546
+ value: 67.928237896289
2547
+ - task:
2548
+ type: PairClassification
2549
+ dataset:
2550
+ type: mteb/twitterurlcorpus-pairclassification
2551
+ name: MTEB TwitterURLCorpus
2552
+ config: default
2553
+ split: test
2554
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
2555
+ metrics:
2556
+ - type: cos_sim_accuracy
2557
+ value: 88.90441262079403
2558
+ - type: cos_sim_ap
2559
+ value: 85.79331880741438
2560
+ - type: cos_sim_f1
2561
+ value: 78.31563529842548
2562
+ - type: cos_sim_precision
2563
+ value: 74.6683424102779
2564
+ - type: cos_sim_recall
2565
+ value: 82.33754234678165
2566
+ - type: dot_accuracy
2567
+ value: 84.89928978926534
2568
+ - type: dot_ap
2569
+ value: 75.25819218316
2570
+ - type: dot_f1
2571
+ value: 69.88730119720536
2572
+ - type: dot_precision
2573
+ value: 64.23362374959665
2574
+ - type: dot_recall
2575
+ value: 76.63227594702803
2576
+ - type: euclidean_accuracy
2577
+ value: 89.01695967710637
2578
+ - type: euclidean_ap
2579
+ value: 85.98986606038852
2580
+ - type: euclidean_f1
2581
+ value: 78.5277880014722
2582
+ - type: euclidean_precision
2583
+ value: 75.22211253701876
2584
+ - type: euclidean_recall
2585
+ value: 82.13735756082538
2586
+ - type: manhattan_accuracy
2587
+ value: 88.99561454573679
2588
+ - type: manhattan_ap
2589
+ value: 85.92262421793953
2590
+ - type: manhattan_f1
2591
+ value: 78.38866094740769
2592
+ - type: manhattan_precision
2593
+ value: 76.02373028505282
2594
+ - type: manhattan_recall
2595
+ value: 80.9054511857099
2596
+ - type: max_accuracy
2597
+ value: 89.01695967710637
2598
+ - type: max_ap
2599
+ value: 85.98986606038852
2600
+ - type: max_f1
2601
+ value: 78.5277880014722
2602
+ language:
2603
+ - en
2604
+ license: mit
2605
+ duplicated_from: michaelfeil/ct2fast-e5-small-v2
2606
+ ---
2607
+ # Fast Inference with CTranslate2
2608
+ Speed up inference and reduce memory use by 2x-4x with int8 inference in C++ on CPU or GPU.
2609
+
2610
+ Quantized version of [intfloat/e5-small-v2](https://huggingface.co/intfloat/e5-small-v2).
2611
+ ```bash
2612
+ pip install "hf-hub-ctranslate2>=2.12.0" "ctranslate2>=3.16.0"  # quotes keep the shell from treating '>' as a redirect
2613
+ ```
2614
+
2615
+ ```python
2616
+ # from transformers import AutoTokenizer
2617
+ model_name = "michaelfeil/ct2fast-e5-small-v2"
2618
+ model_name_orig="intfloat/e5-small-v2"
2619
+
2620
+ from hf_hub_ctranslate2 import EncoderCT2fromHfHub
2621
+ model = EncoderCT2fromHfHub(
2622
+ # load in int8 on CUDA
2623
+ model_name_or_path=model_name,
2624
+ device="cuda",
2625
+ compute_type="int8_float16"
2626
+ )
2627
+ outputs = model.generate(
2628
+ text=["I like soccer", "I like tennis", "The eiffel tower is in Paris"],
2629
+ max_length=64,
2630
+ ) # perform downstream tasks on outputs
2631
+ outputs["pooler_output"]
2632
+ outputs["last_hidden_state"]
2633
+ outputs["attention_mask"]
2634
+
2635
+ # alternative, use SentenceTransformer Mix-In
2636
+ # for end-to-end Sentence embeddings generation
2637
+ # (not pulling from this CT2fast-HF repo)
2638
+
2639
+ from hf_hub_ctranslate2 import CT2SentenceTransformer
2640
+ model = CT2SentenceTransformer(
2641
+ model_name_orig, compute_type="int8_float16", device="cuda"
2642
+ )
2643
+ embeddings = model.encode(
2644
+ ["I like soccer", "I like tennis", "The eiffel tower is in Paris"],
2645
+ batch_size=32,
2646
+ convert_to_numpy=True,
2647
+ normalize_embeddings=True,
2648
+ )
2649
+ print(embeddings.shape, embeddings)
2650
+ scores = (embeddings @ embeddings.T) * 100
2651
+
2652
+ ```
2653
+
2654
+ Checkpoint compatible with [ctranslate2>=3.16.0](https://github.com/OpenNMT/CTranslate2)
2655
+ and [hf-hub-ctranslate2>=2.12.0](https://github.com/michaelfeil/hf-hub-ctranslate2)
2656
+ - `compute_type=int8_float16` for `device="cuda"`
2657
+ - `compute_type=int8` for `device="cpu"` (see the CPU sketch below)
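+
+ For CPU-only deployment, the same loader can be used; a minimal sketch, assuming the `EncoderCT2fromHfHub` API shown in the CUDA example above:
+ ```python
+ from hf_hub_ctranslate2 import EncoderCT2fromHfHub
+
+ # load the same checkpoint with int8 weights on CPU
+ model = EncoderCT2fromHfHub(
+     model_name_or_path="michaelfeil/ct2fast-e5-small-v2",
+     device="cpu",
+     compute_type="int8",
+ )
+ outputs = model.generate(
+     text=["query: summit define"],
+     max_length=64,
+ )
+ outputs["pooler_output"]  # sentence embeddings for downstream use
+ ```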
2658
+
2659
+ Converted on 2023-06-19 using
2660
+ ```bash
2661
+ ct2-transformers-converter --model intfloat/e5-small-v2 --output_dir ~/tmp-ct2fast-e5-small-v2 --force --copy_files tokenizer.json modules.json README.md tokenizer_config.json sentence_bert_config.json vocab.txt special_tokens_map.json .gitattributes --trust_remote_code
2662
+ ```
2663
+
2664
+ # Licence and other remarks:
2665
+ This is just a quantized version. Licence conditions are intended to be identical to the original Hugging Face repo.
2666
+
2667
+ # Original description
2668
+
2669
+
2670
+ # E5-small-v2
2671
+
2672
+ [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
2673
+ Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
2674
+
2675
+ This model has 12 layers and the embedding size is 384.
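+
+ These dimensions can be sanity-checked against the published config; a minimal sketch, assuming a standard `transformers` installation:
+ ```python
+ from transformers import AutoConfig
+
+ cfg = AutoConfig.from_pretrained("intfloat/e5-small-v2")
+ print(cfg.num_hidden_layers, cfg.hidden_size)  # expected: 12 384
+ ```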
2676
+
2677
+ ## Usage
2678
+
2679
+ Below is an example of encoding queries and passages from the MS-MARCO passage ranking dataset.
2680
+
2681
+ ```python
2682
+ import torch.nn.functional as F
2683
+
2684
+ from torch import Tensor
2685
+ from transformers import AutoTokenizer, AutoModel
2686
+
2687
+
2688
+ def average_pool(last_hidden_states: Tensor,
2689
+ attention_mask: Tensor) -> Tensor:
2690
+ last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
2691
+ return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
2692
+
2693
+
2694
+ # Each input text should start with "query: " or "passage: ".
2695
+ # For tasks other than retrieval, you can simply use the "query: " prefix.
2696
+ input_texts = ['query: how much protein should a female eat',
2697
+ 'query: summit define',
2698
+ "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
2699
+ "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
2700
+
2701
+ tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-small-v2')
2702
+ model = AutoModel.from_pretrained('intfloat/e5-small-v2')
2703
+
2704
+ # Tokenize the input texts
2705
+ batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
2706
+
2707
+ outputs = model(**batch_dict)
2708
+ embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
2709
+
2710
+ # (Optionally) normalize embeddings
2711
+ embeddings = F.normalize(embeddings, p=2, dim=1)
2712
+ scores = (embeddings[:2] @ embeddings[2:].T) * 100
2713
+ print(scores.tolist())
2714
+ ```
2715
+
2716
+ ## Training Details
2717
+
2718
+ Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
2719
+
2720
+ ## Benchmark Evaluation
2721
+
2722
+ Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
2723
+ on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB benchmark](https://arxiv.org/abs/2210.07316).
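+
+ As a rough, unofficial smoke test, a single MTEB task can also be run with the `mteb` package (an assumed workflow, not the official scripts); note that this sketch omits the `query: `/`passage: ` prefixes the official evaluation adds, so scores will not exactly match the reported numbers:
+ ```python
+ # unofficial smoke test with the mteb package; the unilm/e5 scripts remain
+ # the reference for reproducing the reported results
+ from mteb import MTEB
+ from sentence_transformers import SentenceTransformer
+
+ model = SentenceTransformer("intfloat/e5-small-v2")
+ evaluation = MTEB(tasks=["STSBenchmark"])
+ evaluation.run(model, output_folder="results/e5-small-v2")
+ ```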
2724
+
2725
+ ## Citation
2726
+
2727
+ If you find our paper or models helpful, please consider citing them as follows:
2728
+
2729
+ ```bibtex
2730
+ @article{wang2022text,
2731
+ title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
2732
+ author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
2733
+ journal={arXiv preprint arXiv:2212.03533},
2734
+ year={2022}
2735
+ }
2736
+ ```
2737
+
2738
+ ## Limitations
2739
+
2740
+ This model only works for English texts. Long texts will be truncated to at most 512 tokens.
2741
+
2742
+ ## Sentence Transformers
2743
+
2744
+ Below is an example of usage with sentence_transformers (`pip install sentence_transformers~=2.2.2`).
2745
+ This is community contributed, and results may vary slightly due to numerical precision.
2746
+ ```python
2747
+ from sentence_transformers import SentenceTransformer
2748
+ model = SentenceTransformer('intfloat/e5-small-v2')
2749
+ # input_texts as defined in the Transformers example above
+ embeddings = model.encode(input_texts, normalize_embeddings=True)
2750
+ ```
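+
+ Since these embeddings are L2-normalized, query-passage scores analogous to the Transformers example above can be computed directly; a small sketch:
+ ```python
+ # dot products of normalized embeddings are cosine similarities
+ scores = (embeddings[:2] @ embeddings[2:].T) * 100
+ print(scores)
+ ```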
config.json ADDED
@@ -0,0 +1,29 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "_name_or_path": "tmp/",
3
+ "architectures": [
4
+ "BertModel"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.1,
7
+ "classifier_dropout": null,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.1,
10
+ "hidden_size": 384,
11
+ "initializer_range": 0.02,
12
+ "intermediate_size": 1536,
13
+ "layer_norm_eps": 1e-12,
14
+ "max_position_embeddings": 512,
15
+ "model_type": "bert",
16
+ "num_attention_heads": 12,
17
+ "num_hidden_layers": 12,
18
+ "pad_token_id": 0,
19
+ "position_embedding_type": "absolute",
20
+ "torch_dtype": "float32",
21
+ "transformers_version": "4.29.0.dev0",
22
+ "type_vocab_size": 2,
23
+ "use_cache": true,
24
+ "vocab_size": 30522,
25
+ "bos_token": "<s>",
26
+ "eos_token": "</s>",
27
+ "layer_norm_epsilon": 1e-12,
28
+ "unk_token": "[UNK]"
29
+ }
model.bin ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:a201dec3f480d71769e0126bc25d0da451185a75f654d1ece99a26be83dd02aa
3
+ size 133448364
modules.json ADDED
@@ -0,0 +1,20 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ },
14
+ {
15
+ "idx": 2,
16
+ "name": "2",
17
+ "path": "2_Normalize",
18
+ "type": "sentence_transformers.models.Normalize"
19
+ }
20
+ ]
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
 
 
 
 
 
1
+ {
2
+ "max_seq_length": 512,
3
+ "do_lower_case": false
4
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
 
 
 
 
 
 
 
 
1
+ {
2
+ "cls_token": "[CLS]",
3
+ "mask_token": "[MASK]",
4
+ "pad_token": "[PAD]",
5
+ "sep_token": "[SEP]",
6
+ "unk_token": "[UNK]"
7
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,15 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "clean_up_tokenization_spaces": true,
3
+ "cls_token": "[CLS]",
4
+ "do_basic_tokenize": true,
5
+ "do_lower_case": true,
6
+ "mask_token": "[MASK]",
7
+ "model_max_length": 1000000000000000019884624838656,
8
+ "never_split": null,
9
+ "pad_token": "[PAD]",
10
+ "sep_token": "[SEP]",
11
+ "strip_accents": null,
12
+ "tokenize_chinese_chars": true,
13
+ "tokenizer_class": "BertTokenizer",
14
+ "unk_token": "[UNK]"
15
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff
 
vocabulary.json ADDED
The diff for this file is too large to render. See raw diff
 
vocabulary.txt ADDED
The diff for this file is too large to render. See raw diff