Tags: Transformers · English · mteb · Eval Results · Inference Endpoints
jncraton committed
Commit 8b3d53b · 1 parent: 75a3202
Files changed (8)
  1. README.md +2684 -0
  2. config.json +6 -0
  3. model.bin +3 -0
  4. special_tokens_map.json +7 -0
  5. tokenizer.json +0 -0
  6. tokenizer_config.json +15 -0
  7. vocab.txt +0 -0
  8. vocabulary.json +0 -0
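The README.md added in this commit consists almost entirely of `model-index` front matter, the schema the MTEB leaderboard reads: one record per task, each with a `task`, a `dataset`, and a list of `metrics`. As a minimal sketch of navigating that shape, the dict below hand-copies one record from the diff; `extract_metric` is a hypothetical helper for illustration, not part of this repository:

```python
# Excerpt of one model-index record from the diff below (values copied verbatim).
# extract_metric() is a hypothetical helper, not part of this repository.
model_index = {
    "name": "e5-small-v2",
    "results": [
        {
            "task": {"type": "Classification"},
            "dataset": {
                "type": "mteb/amazon_counterfactual",
                "name": "MTEB AmazonCounterfactualClassification (en)",
                "config": "en",
                "split": "test",
            },
            "metrics": [
                {"type": "accuracy", "value": 77.59701492537313},
                {"type": "ap", "value": 41.67064885731708},
                {"type": "f1", "value": 71.86465946398573},
            ],
        }
    ],
}

def extract_metric(index, dataset_name, metric_type):
    """Return one metric value for one dataset, or None if absent."""
    for result in index["results"]:
        if result["dataset"]["name"] == dataset_name:
            for metric in result["metrics"]:
                if metric["type"] == metric_type:
                    return metric["value"]
    return None

acc = extract_metric(
    model_index,
    "MTEB AmazonCounterfactualClassification (en)",
    "accuracy",
)
print(acc)  # prints 77.59701492537313
```

The full front matter in the diff repeats this record structure for every MTEB task the model was evaluated on.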
README.md ADDED
@@ -0,0 +1,2684 @@
+ ---
+ tags:
+ - mteb
+ model-index:
+ - name: e5-small-v2
+   results:
+   - task:
+       type: Classification
+     dataset:
+       type: mteb/amazon_counterfactual
+       name: MTEB AmazonCounterfactualClassification (en)
+       config: en
+       split: test
+       revision: e8379541af4e31359cca9fbcf4b00f2671dba205
+     metrics:
+     - type: accuracy
+       value: 77.59701492537313
+     - type: ap
+       value: 41.67064885731708
+     - type: f1
+       value: 71.86465946398573
+   - task:
+       type: Classification
+     dataset:
+       type: mteb/amazon_polarity
+       name: MTEB AmazonPolarityClassification
+       config: default
+       split: test
+       revision: e2d317d38cd51312af73b3d32a06d1a08b442046
+     metrics:
+     - type: accuracy
+       value: 91.265875
+     - type: ap
+       value: 87.67633085349644
+     - type: f1
+       value: 91.24297521425744
+   - task:
+       type: Classification
+     dataset:
+       type: mteb/amazon_reviews_multi
+       name: MTEB AmazonReviewsClassification (en)
+       config: en
+       split: test
+       revision: 1399c76144fd37290681b995c656ef9b2e06e26d
+     metrics:
+     - type: accuracy
+       value: 45.882000000000005
+     - type: f1
+       value: 45.08058870381236
+   - task:
+       type: Retrieval
+     dataset:
+       type: arguana
+       name: MTEB ArguAna
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 20.697
+     - type: map_at_10
+       value: 33.975
+     - type: map_at_100
+       value: 35.223
+     - type: map_at_1000
+       value: 35.260000000000005
+     - type: map_at_3
+       value: 29.776999999999997
+     - type: map_at_5
+       value: 32.035000000000004
+     - type: mrr_at_1
+       value: 20.982
+     - type: mrr_at_10
+       value: 34.094
+     - type: mrr_at_100
+       value: 35.343
+     - type: mrr_at_1000
+       value: 35.38
+     - type: mrr_at_3
+       value: 29.884
+     - type: mrr_at_5
+       value: 32.141999999999996
+     - type: ndcg_at_1
+       value: 20.697
+     - type: ndcg_at_10
+       value: 41.668
+     - type: ndcg_at_100
+       value: 47.397
+     - type: ndcg_at_1000
+       value: 48.305
+     - type: ndcg_at_3
+       value: 32.928000000000004
+     - type: ndcg_at_5
+       value: 36.998999999999995
+     - type: precision_at_1
+       value: 20.697
+     - type: precision_at_10
+       value: 6.636
+     - type: precision_at_100
+       value: 0.924
+     - type: precision_at_1000
+       value: 0.099
+     - type: precision_at_3
+       value: 14.035
+     - type: precision_at_5
+       value: 10.398
+     - type: recall_at_1
+       value: 20.697
+     - type: recall_at_10
+       value: 66.35799999999999
+     - type: recall_at_100
+       value: 92.39
+     - type: recall_at_1000
+       value: 99.36
+     - type: recall_at_3
+       value: 42.105
+     - type: recall_at_5
+       value: 51.991
+   - task:
+       type: Clustering
+     dataset:
+       type: mteb/arxiv-clustering-p2p
+       name: MTEB ArxivClusteringP2P
+       config: default
+       split: test
+       revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
+     metrics:
+     - type: v_measure
+       value: 42.1169517447068
+   - task:
+       type: Clustering
+     dataset:
+       type: mteb/arxiv-clustering-s2s
+       name: MTEB ArxivClusteringS2S
+       config: default
+       split: test
+       revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
+     metrics:
+     - type: v_measure
+       value: 34.79553720107097
+   - task:
+       type: Reranking
+     dataset:
+       type: mteb/askubuntudupquestions-reranking
+       name: MTEB AskUbuntuDupQuestions
+       config: default
+       split: test
+       revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
+     metrics:
+     - type: map
+       value: 58.10811337308168
+     - type: mrr
+       value: 71.56410763751482
+   - task:
+       type: STS
+     dataset:
+       type: mteb/biosses-sts
+       name: MTEB BIOSSES
+       config: default
+       split: test
+       revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
+     metrics:
+     - type: cos_sim_pearson
+       value: 78.46834918248696
+     - type: cos_sim_spearman
+       value: 79.4289182755206
+     - type: euclidean_pearson
+       value: 76.26662973727008
+     - type: euclidean_spearman
+       value: 78.11744260952536
+     - type: manhattan_pearson
+       value: 76.08175262609434
+     - type: manhattan_spearman
+       value: 78.29395265552289
+   - task:
+       type: Classification
+     dataset:
+       type: mteb/banking77
+       name: MTEB Banking77Classification
+       config: default
+       split: test
+       revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
+     metrics:
+     - type: accuracy
+       value: 81.63636363636364
+     - type: f1
+       value: 81.55779952376953
+   - task:
+       type: Clustering
+     dataset:
+       type: mteb/biorxiv-clustering-p2p
+       name: MTEB BiorxivClusteringP2P
+       config: default
+       split: test
+       revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
+     metrics:
+     - type: v_measure
+       value: 35.88541137137571
+   - task:
+       type: Clustering
+     dataset:
+       type: mteb/biorxiv-clustering-s2s
+       name: MTEB BiorxivClusteringS2S
+       config: default
+       split: test
+       revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
+     metrics:
+     - type: v_measure
+       value: 30.05205685274407
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackAndroidRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 30.293999999999997
+     - type: map_at_10
+       value: 39.876
+     - type: map_at_100
+       value: 41.315000000000005
+     - type: map_at_1000
+       value: 41.451
+     - type: map_at_3
+       value: 37.194
+     - type: map_at_5
+       value: 38.728
+     - type: mrr_at_1
+       value: 37.053000000000004
+     - type: mrr_at_10
+       value: 45.281
+     - type: mrr_at_100
+       value: 46.188
+     - type: mrr_at_1000
+       value: 46.245999999999995
+     - type: mrr_at_3
+       value: 43.228
+     - type: mrr_at_5
+       value: 44.366
+     - type: ndcg_at_1
+       value: 37.053000000000004
+     - type: ndcg_at_10
+       value: 45.086
+     - type: ndcg_at_100
+       value: 50.756
+     - type: ndcg_at_1000
+       value: 53.123
+     - type: ndcg_at_3
+       value: 41.416
+     - type: ndcg_at_5
+       value: 43.098
+     - type: precision_at_1
+       value: 37.053000000000004
+     - type: precision_at_10
+       value: 8.34
+     - type: precision_at_100
+       value: 1.346
+     - type: precision_at_1000
+       value: 0.186
+     - type: precision_at_3
+       value: 19.647000000000002
+     - type: precision_at_5
+       value: 13.877
+     - type: recall_at_1
+       value: 30.293999999999997
+     - type: recall_at_10
+       value: 54.309
+     - type: recall_at_100
+       value: 78.59
+     - type: recall_at_1000
+       value: 93.82300000000001
+     - type: recall_at_3
+       value: 43.168
+     - type: recall_at_5
+       value: 48.192
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackEnglishRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 28.738000000000003
+     - type: map_at_10
+       value: 36.925999999999995
+     - type: map_at_100
+       value: 38.017
+     - type: map_at_1000
+       value: 38.144
+     - type: map_at_3
+       value: 34.446
+     - type: map_at_5
+       value: 35.704
+     - type: mrr_at_1
+       value: 35.478
+     - type: mrr_at_10
+       value: 42.786
+     - type: mrr_at_100
+       value: 43.458999999999996
+     - type: mrr_at_1000
+       value: 43.507
+     - type: mrr_at_3
+       value: 40.648
+     - type: mrr_at_5
+       value: 41.804
+     - type: ndcg_at_1
+       value: 35.478
+     - type: ndcg_at_10
+       value: 42.044
+     - type: ndcg_at_100
+       value: 46.249
+     - type: ndcg_at_1000
+       value: 48.44
+     - type: ndcg_at_3
+       value: 38.314
+     - type: ndcg_at_5
+       value: 39.798
+     - type: precision_at_1
+       value: 35.478
+     - type: precision_at_10
+       value: 7.764
+     - type: precision_at_100
+       value: 1.253
+     - type: precision_at_1000
+       value: 0.174
+     - type: precision_at_3
+       value: 18.047
+     - type: precision_at_5
+       value: 12.637
+     - type: recall_at_1
+       value: 28.738000000000003
+     - type: recall_at_10
+       value: 50.659
+     - type: recall_at_100
+       value: 68.76299999999999
+     - type: recall_at_1000
+       value: 82.811
+     - type: recall_at_3
+       value: 39.536
+     - type: recall_at_5
+       value: 43.763999999999996
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackGamingRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 38.565
+     - type: map_at_10
+       value: 50.168
+     - type: map_at_100
+       value: 51.11
+     - type: map_at_1000
+       value: 51.173
+     - type: map_at_3
+       value: 47.044000000000004
+     - type: map_at_5
+       value: 48.838
+     - type: mrr_at_1
+       value: 44.201
+     - type: mrr_at_10
+       value: 53.596999999999994
+     - type: mrr_at_100
+       value: 54.211
+     - type: mrr_at_1000
+       value: 54.247
+     - type: mrr_at_3
+       value: 51.202000000000005
+     - type: mrr_at_5
+       value: 52.608999999999995
+     - type: ndcg_at_1
+       value: 44.201
+     - type: ndcg_at_10
+       value: 55.694
+     - type: ndcg_at_100
+       value: 59.518
+     - type: ndcg_at_1000
+       value: 60.907
+     - type: ndcg_at_3
+       value: 50.395999999999994
+     - type: ndcg_at_5
+       value: 53.022999999999996
+     - type: precision_at_1
+       value: 44.201
+     - type: precision_at_10
+       value: 8.84
+     - type: precision_at_100
+       value: 1.162
+     - type: precision_at_1000
+       value: 0.133
+     - type: precision_at_3
+       value: 22.153
+     - type: precision_at_5
+       value: 15.260000000000002
+     - type: recall_at_1
+       value: 38.565
+     - type: recall_at_10
+       value: 68.65
+     - type: recall_at_100
+       value: 85.37400000000001
+     - type: recall_at_1000
+       value: 95.37400000000001
+     - type: recall_at_3
+       value: 54.645999999999994
+     - type: recall_at_5
+       value: 60.958
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackGisRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 23.945
+     - type: map_at_10
+       value: 30.641000000000002
+     - type: map_at_100
+       value: 31.599
+     - type: map_at_1000
+       value: 31.691000000000003
+     - type: map_at_3
+       value: 28.405
+     - type: map_at_5
+       value: 29.704000000000004
+     - type: mrr_at_1
+       value: 25.537
+     - type: mrr_at_10
+       value: 32.22
+     - type: mrr_at_100
+       value: 33.138
+     - type: mrr_at_1000
+       value: 33.214
+     - type: mrr_at_3
+       value: 30.151
+     - type: mrr_at_5
+       value: 31.298
+     - type: ndcg_at_1
+       value: 25.537
+     - type: ndcg_at_10
+       value: 34.638000000000005
+     - type: ndcg_at_100
+       value: 39.486
+     - type: ndcg_at_1000
+       value: 41.936
+     - type: ndcg_at_3
+       value: 30.333
+     - type: ndcg_at_5
+       value: 32.482
+     - type: precision_at_1
+       value: 25.537
+     - type: precision_at_10
+       value: 5.153
+     - type: precision_at_100
+       value: 0.7929999999999999
+     - type: precision_at_1000
+       value: 0.104
+     - type: precision_at_3
+       value: 12.429
+     - type: precision_at_5
+       value: 8.723
+     - type: recall_at_1
+       value: 23.945
+     - type: recall_at_10
+       value: 45.412
+     - type: recall_at_100
+       value: 67.836
+     - type: recall_at_1000
+       value: 86.467
+     - type: recall_at_3
+       value: 34.031
+     - type: recall_at_5
+       value: 39.039
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackMathematicaRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 14.419
+     - type: map_at_10
+       value: 20.858999999999998
+     - type: map_at_100
+       value: 22.067999999999998
+     - type: map_at_1000
+       value: 22.192
+     - type: map_at_3
+       value: 18.673000000000002
+     - type: map_at_5
+       value: 19.968
+     - type: mrr_at_1
+       value: 17.785999999999998
+     - type: mrr_at_10
+       value: 24.878
+     - type: mrr_at_100
+       value: 26.021
+     - type: mrr_at_1000
+       value: 26.095000000000002
+     - type: mrr_at_3
+       value: 22.616
+     - type: mrr_at_5
+       value: 23.785
+     - type: ndcg_at_1
+       value: 17.785999999999998
+     - type: ndcg_at_10
+       value: 25.153
+     - type: ndcg_at_100
+       value: 31.05
+     - type: ndcg_at_1000
+       value: 34.052
+     - type: ndcg_at_3
+       value: 21.117
+     - type: ndcg_at_5
+       value: 23.048
+     - type: precision_at_1
+       value: 17.785999999999998
+     - type: precision_at_10
+       value: 4.590000000000001
+     - type: precision_at_100
+       value: 0.864
+     - type: precision_at_1000
+       value: 0.125
+     - type: precision_at_3
+       value: 9.908999999999999
+     - type: precision_at_5
+       value: 7.313
+     - type: recall_at_1
+       value: 14.419
+     - type: recall_at_10
+       value: 34.477999999999994
+     - type: recall_at_100
+       value: 60.02499999999999
+     - type: recall_at_1000
+       value: 81.646
+     - type: recall_at_3
+       value: 23.515
+     - type: recall_at_5
+       value: 28.266999999999996
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackPhysicsRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 26.268
+     - type: map_at_10
+       value: 35.114000000000004
+     - type: map_at_100
+       value: 36.212
+     - type: map_at_1000
+       value: 36.333
+     - type: map_at_3
+       value: 32.436
+     - type: map_at_5
+       value: 33.992
+     - type: mrr_at_1
+       value: 31.761
+     - type: mrr_at_10
+       value: 40.355999999999995
+     - type: mrr_at_100
+       value: 41.125
+     - type: mrr_at_1000
+       value: 41.186
+     - type: mrr_at_3
+       value: 37.937
+     - type: mrr_at_5
+       value: 39.463
+     - type: ndcg_at_1
+       value: 31.761
+     - type: ndcg_at_10
+       value: 40.422000000000004
+     - type: ndcg_at_100
+       value: 45.458999999999996
+     - type: ndcg_at_1000
+       value: 47.951
+     - type: ndcg_at_3
+       value: 35.972
+     - type: ndcg_at_5
+       value: 38.272
+     - type: precision_at_1
+       value: 31.761
+     - type: precision_at_10
+       value: 7.103
+     - type: precision_at_100
+       value: 1.133
+     - type: precision_at_1000
+       value: 0.152
+     - type: precision_at_3
+       value: 16.779
+     - type: precision_at_5
+       value: 11.877
+     - type: recall_at_1
+       value: 26.268
+     - type: recall_at_10
+       value: 51.053000000000004
+     - type: recall_at_100
+       value: 72.702
+     - type: recall_at_1000
+       value: 89.521
+     - type: recall_at_3
+       value: 38.619
+     - type: recall_at_5
+       value: 44.671
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackProgrammersRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 25.230999999999998
+     - type: map_at_10
+       value: 34.227000000000004
+     - type: map_at_100
+       value: 35.370000000000005
+     - type: map_at_1000
+       value: 35.488
+     - type: map_at_3
+       value: 31.496000000000002
+     - type: map_at_5
+       value: 33.034
+     - type: mrr_at_1
+       value: 30.822
+     - type: mrr_at_10
+       value: 39.045
+     - type: mrr_at_100
+       value: 39.809
+     - type: mrr_at_1000
+       value: 39.873
+     - type: mrr_at_3
+       value: 36.663000000000004
+     - type: mrr_at_5
+       value: 37.964
+     - type: ndcg_at_1
+       value: 30.822
+     - type: ndcg_at_10
+       value: 39.472
+     - type: ndcg_at_100
+       value: 44.574999999999996
+     - type: ndcg_at_1000
+       value: 47.162
+     - type: ndcg_at_3
+       value: 34.929
+     - type: ndcg_at_5
+       value: 37.002
+     - type: precision_at_1
+       value: 30.822
+     - type: precision_at_10
+       value: 7.055
+     - type: precision_at_100
+       value: 1.124
+     - type: precision_at_1000
+       value: 0.152
+     - type: precision_at_3
+       value: 16.591
+     - type: precision_at_5
+       value: 11.667
+     - type: recall_at_1
+       value: 25.230999999999998
+     - type: recall_at_10
+       value: 50.42100000000001
+     - type: recall_at_100
+       value: 72.685
+     - type: recall_at_1000
+       value: 90.469
+     - type: recall_at_3
+       value: 37.503
+     - type: recall_at_5
+       value: 43.123
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 24.604166666666664
+     - type: map_at_10
+       value: 32.427166666666665
+     - type: map_at_100
+       value: 33.51474999999999
+     - type: map_at_1000
+       value: 33.6345
+     - type: map_at_3
+       value: 30.02366666666667
+     - type: map_at_5
+       value: 31.382333333333328
+     - type: mrr_at_1
+       value: 29.001166666666666
+     - type: mrr_at_10
+       value: 36.3315
+     - type: mrr_at_100
+       value: 37.16683333333333
+     - type: mrr_at_1000
+       value: 37.23341666666668
+     - type: mrr_at_3
+       value: 34.19916666666667
+     - type: mrr_at_5
+       value: 35.40458333333334
+     - type: ndcg_at_1
+       value: 29.001166666666666
+     - type: ndcg_at_10
+       value: 37.06883333333334
+     - type: ndcg_at_100
+       value: 41.95816666666666
+     - type: ndcg_at_1000
+       value: 44.501583333333336
+     - type: ndcg_at_3
+       value: 32.973499999999994
+     - type: ndcg_at_5
+       value: 34.90833333333334
+     - type: precision_at_1
+       value: 29.001166666666666
+     - type: precision_at_10
+       value: 6.336
+     - type: precision_at_100
+       value: 1.0282499999999999
+     - type: precision_at_1000
+       value: 0.14391666666666664
+     - type: precision_at_3
+       value: 14.932499999999996
+     - type: precision_at_5
+       value: 10.50825
+     - type: recall_at_1
+       value: 24.604166666666664
+     - type: recall_at_10
+       value: 46.9525
+     - type: recall_at_100
+       value: 68.67816666666667
+     - type: recall_at_1000
+       value: 86.59783333333334
+     - type: recall_at_3
+       value: 35.49783333333333
+     - type: recall_at_5
+       value: 40.52525000000001
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackStatsRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 23.559
+     - type: map_at_10
+       value: 29.023
+     - type: map_at_100
+       value: 29.818
+     - type: map_at_1000
+       value: 29.909000000000002
+     - type: map_at_3
+       value: 27.037
+     - type: map_at_5
+       value: 28.225
+     - type: mrr_at_1
+       value: 26.994
+     - type: mrr_at_10
+       value: 31.962000000000003
+     - type: mrr_at_100
+       value: 32.726
+     - type: mrr_at_1000
+       value: 32.800000000000004
+     - type: mrr_at_3
+       value: 30.266
+     - type: mrr_at_5
+       value: 31.208999999999996
+     - type: ndcg_at_1
+       value: 26.994
+     - type: ndcg_at_10
+       value: 32.53
+     - type: ndcg_at_100
+       value: 36.758
+     - type: ndcg_at_1000
+       value: 39.362
+     - type: ndcg_at_3
+       value: 28.985
+     - type: ndcg_at_5
+       value: 30.757
+     - type: precision_at_1
+       value: 26.994
+     - type: precision_at_10
+       value: 4.968999999999999
+     - type: precision_at_100
+       value: 0.759
+     - type: precision_at_1000
+       value: 0.106
+     - type: precision_at_3
+       value: 12.219
+     - type: precision_at_5
+       value: 8.527999999999999
+     - type: recall_at_1
+       value: 23.559
+     - type: recall_at_10
+       value: 40.585
+     - type: recall_at_100
+       value: 60.306000000000004
+     - type: recall_at_1000
+       value: 80.11
+     - type: recall_at_3
+       value: 30.794
+     - type: recall_at_5
+       value: 35.186
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackTexRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 16.384999999999998
+     - type: map_at_10
+       value: 22.142
+     - type: map_at_100
+       value: 23.057
+     - type: map_at_1000
+       value: 23.177
+     - type: map_at_3
+       value: 20.29
+     - type: map_at_5
+       value: 21.332
+     - type: mrr_at_1
+       value: 19.89
+     - type: mrr_at_10
+       value: 25.771
+     - type: mrr_at_100
+       value: 26.599
+     - type: mrr_at_1000
+       value: 26.680999999999997
+     - type: mrr_at_3
+       value: 23.962
+     - type: mrr_at_5
+       value: 24.934
+     - type: ndcg_at_1
+       value: 19.89
+     - type: ndcg_at_10
+       value: 25.97
+     - type: ndcg_at_100
+       value: 30.605
+     - type: ndcg_at_1000
+       value: 33.619
+     - type: ndcg_at_3
+       value: 22.704
+     - type: ndcg_at_5
+       value: 24.199
+     - type: precision_at_1
+       value: 19.89
+     - type: precision_at_10
+       value: 4.553
+     - type: precision_at_100
+       value: 0.8049999999999999
+     - type: precision_at_1000
+       value: 0.122
+     - type: precision_at_3
+       value: 10.541
+     - type: precision_at_5
+       value: 7.46
+     - type: recall_at_1
+       value: 16.384999999999998
+     - type: recall_at_10
+       value: 34.001
+     - type: recall_at_100
+       value: 55.17100000000001
+     - type: recall_at_1000
+       value: 77.125
+     - type: recall_at_3
+       value: 24.618000000000002
+     - type: recall_at_5
+       value: 28.695999999999998
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackUnixRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 23.726
+     - type: map_at_10
+       value: 31.227
+     - type: map_at_100
+       value: 32.311
+     - type: map_at_1000
+       value: 32.419
+     - type: map_at_3
+       value: 28.765
+     - type: map_at_5
+       value: 30.229
+     - type: mrr_at_1
+       value: 27.705000000000002
+     - type: mrr_at_10
+       value: 35.085
+     - type: mrr_at_100
+       value: 35.931000000000004
+     - type: mrr_at_1000
+       value: 36
+     - type: mrr_at_3
+       value: 32.603
+     - type: mrr_at_5
+       value: 34.117999999999995
+     - type: ndcg_at_1
+       value: 27.705000000000002
+     - type: ndcg_at_10
+       value: 35.968
+     - type: ndcg_at_100
+       value: 41.197
+     - type: ndcg_at_1000
+       value: 43.76
+     - type: ndcg_at_3
+       value: 31.304
+     - type: ndcg_at_5
+       value: 33.661
+     - type: precision_at_1
+       value: 27.705000000000002
+     - type: precision_at_10
+       value: 5.942
+     - type: precision_at_100
+       value: 0.964
+     - type: precision_at_1000
+       value: 0.13
+     - type: precision_at_3
+       value: 13.868
+     - type: precision_at_5
+       value: 9.944
+     - type: recall_at_1
+       value: 23.726
+     - type: recall_at_10
+       value: 46.786
+     - type: recall_at_100
+       value: 70.072
+     - type: recall_at_1000
+       value: 88.2
+     - type: recall_at_3
+       value: 33.981
+     - type: recall_at_5
+       value: 39.893
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackWebmastersRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 23.344
+     - type: map_at_10
+       value: 31.636999999999997
+     - type: map_at_100
+       value: 33.065
+     - type: map_at_1000
+       value: 33.300000000000004
+     - type: map_at_3
+       value: 29.351
+     - type: map_at_5
+       value: 30.432
+     - type: mrr_at_1
+       value: 27.866000000000003
+     - type: mrr_at_10
+       value: 35.587
+     - type: mrr_at_100
+       value: 36.52
+     - type: mrr_at_1000
+       value: 36.597
+     - type: mrr_at_3
+       value: 33.696
+     - type: mrr_at_5
+       value: 34.713
+     - type: ndcg_at_1
+       value: 27.866000000000003
+     - type: ndcg_at_10
+       value: 36.61
+     - type: ndcg_at_100
+       value: 41.88
+     - type: ndcg_at_1000
+       value: 45.105000000000004
+     - type: ndcg_at_3
+       value: 33.038000000000004
+     - type: ndcg_at_5
+       value: 34.331
+     - type: precision_at_1
+       value: 27.866000000000003
+     - type: precision_at_10
+       value: 6.917
+     - type: precision_at_100
+       value: 1.3599999999999999
+     - type: precision_at_1000
+       value: 0.233
+     - type: precision_at_3
+       value: 15.547
+     - type: precision_at_5
+       value: 10.791
+     - type: recall_at_1
+       value: 23.344
+     - type: recall_at_10
+       value: 45.782000000000004
+     - type: recall_at_100
+       value: 69.503
+     - type: recall_at_1000
+       value: 90.742
+     - type: recall_at_3
+       value: 35.160000000000004
+     - type: recall_at_5
+       value: 39.058
+   - task:
+       type: Retrieval
+     dataset:
+       type: BeIR/cqadupstack
+       name: MTEB CQADupstackWordpressRetrieval
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 20.776
+     - type: map_at_10
+       value: 27.285999999999998
+     - type: map_at_100
+       value: 28.235
+     - type: map_at_1000
+       value: 28.337
+     - type: map_at_3
+       value: 25.147000000000002
+     - type: map_at_5
+       value: 26.401999999999997
+     - type: mrr_at_1
+       value: 22.921
+     - type: mrr_at_10
+       value: 29.409999999999997
+     - type: mrr_at_100
+       value: 30.275000000000002
+     - type: mrr_at_1000
+       value: 30.354999999999997
+     - type: mrr_at_3
+       value: 27.418
+     - type: mrr_at_5
+       value: 28.592000000000002
+     - type: ndcg_at_1
+       value: 22.921
+     - type: ndcg_at_10
+       value: 31.239
+     - type: ndcg_at_100
+       value: 35.965
+     - type: ndcg_at_1000
+       value: 38.602
+     - type: ndcg_at_3
+       value: 27.174
+     - type: ndcg_at_5
+       value: 29.229
+     - type: precision_at_1
+       value: 22.921
+     - type: precision_at_10
+       value: 4.806
+     - type: precision_at_100
+       value: 0.776
+     - type: precision_at_1000
+       value: 0.11
+     - type: precision_at_3
+       value: 11.459999999999999
+     - type: precision_at_5
+       value: 8.022
+     - type: recall_at_1
+       value: 20.776
+     - type: recall_at_10
+       value: 41.294
+     - type: recall_at_100
+       value: 63.111
+     - type: recall_at_1000
+       value: 82.88600000000001
+     - type: recall_at_3
+       value: 30.403000000000002
+     - type: recall_at_5
+       value: 35.455999999999996
+   - task:
+       type: Retrieval
+     dataset:
+       type: climate-fever
+       name: MTEB ClimateFEVER
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 9.376
+     - type: map_at_10
+       value: 15.926000000000002
+     - type: map_at_100
+       value: 17.585
+     - type: map_at_1000
+       value: 17.776
+     - type: map_at_3
+       value: 13.014000000000001
+     - type: map_at_5
+       value: 14.417
+     - type: mrr_at_1
+       value: 20.195
+     - type: mrr_at_10
+       value: 29.95
+     - type: mrr_at_100
+       value: 31.052000000000003
+     - type: mrr_at_1000
+       value: 31.108000000000004
+     - type: mrr_at_3
+       value: 26.667
+     - type: mrr_at_5
+       value: 28.458
+     - type: ndcg_at_1
+       value: 20.195
+     - type: ndcg_at_10
+       value: 22.871
+     - type: ndcg_at_100
+       value: 29.921999999999997
+     - type: ndcg_at_1000
+       value: 33.672999999999995
+     - type: ndcg_at_3
1149
+ value: 17.782999999999998
1150
+ - type: ndcg_at_5
1151
+ value: 19.544
1152
+ - type: precision_at_1
1153
+ value: 20.195
1154
+ - type: precision_at_10
1155
+ value: 7.394
1156
+ - type: precision_at_100
1157
+ value: 1.493
1158
+ - type: precision_at_1000
1159
+ value: 0.218
1160
+ - type: precision_at_3
1161
+ value: 13.073
1162
+ - type: precision_at_5
1163
+ value: 10.436
1164
+ - type: recall_at_1
1165
+ value: 9.376
1166
+ - type: recall_at_10
1167
+ value: 28.544999999999998
1168
+ - type: recall_at_100
1169
+ value: 53.147999999999996
1170
+ - type: recall_at_1000
1171
+ value: 74.62
1172
+ - type: recall_at_3
1173
+ value: 16.464000000000002
1174
+ - type: recall_at_5
1175
+ value: 21.004
1176
+ - task:
1177
+ type: Retrieval
1178
+ dataset:
1179
+ type: dbpedia-entity
1180
+ name: MTEB DBPedia
1181
+ config: default
1182
+ split: test
1183
+ revision: None
1184
+ metrics:
1185
+ - type: map_at_1
1186
+ value: 8.415000000000001
1187
+ - type: map_at_10
1188
+ value: 18.738
1189
+ - type: map_at_100
1190
+ value: 27.291999999999998
1191
+ - type: map_at_1000
1192
+ value: 28.992
1193
+ - type: map_at_3
1194
+ value: 13.196
1195
+ - type: map_at_5
1196
+ value: 15.539
1197
+ - type: mrr_at_1
1198
+ value: 66.5
1199
+ - type: mrr_at_10
1200
+ value: 74.518
1201
+ - type: mrr_at_100
1202
+ value: 74.86
1203
+ - type: mrr_at_1000
1204
+ value: 74.87
1205
+ - type: mrr_at_3
1206
+ value: 72.375
1207
+ - type: mrr_at_5
1208
+ value: 73.86200000000001
1209
+ - type: ndcg_at_1
1210
+ value: 54.37499999999999
1211
+ - type: ndcg_at_10
1212
+ value: 41.317
1213
+ - type: ndcg_at_100
1214
+ value: 45.845
1215
+ - type: ndcg_at_1000
1216
+ value: 52.92
1217
+ - type: ndcg_at_3
1218
+ value: 44.983000000000004
1219
+ - type: ndcg_at_5
1220
+ value: 42.989
1221
+ - type: precision_at_1
1222
+ value: 66.5
1223
+ - type: precision_at_10
1224
+ value: 33.6
1225
+ - type: precision_at_100
1226
+ value: 10.972999999999999
1227
+ - type: precision_at_1000
1228
+ value: 2.214
1229
+ - type: precision_at_3
1230
+ value: 48.583
1231
+ - type: precision_at_5
1232
+ value: 42.15
1233
+ - type: recall_at_1
1234
+ value: 8.415000000000001
1235
+ - type: recall_at_10
1236
+ value: 24.953
1237
+ - type: recall_at_100
1238
+ value: 52.48199999999999
1239
+ - type: recall_at_1000
1240
+ value: 75.093
1241
+ - type: recall_at_3
1242
+ value: 14.341000000000001
1243
+ - type: recall_at_5
1244
+ value: 18.468
1245
+ - task:
1246
+ type: Classification
1247
+ dataset:
1248
+ type: mteb/emotion
1249
+ name: MTEB EmotionClassification
1250
+ config: default
1251
+ split: test
1252
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
1253
+ metrics:
1254
+ - type: accuracy
1255
+ value: 47.06499999999999
1256
+ - type: f1
1257
+ value: 41.439327599975385
1258
+ - task:
1259
+ type: Retrieval
1260
+ dataset:
1261
+ type: fever
1262
+ name: MTEB FEVER
1263
+ config: default
1264
+ split: test
1265
+ revision: None
1266
+ metrics:
1267
+ - type: map_at_1
1268
+ value: 66.02
1269
+ - type: map_at_10
1270
+ value: 76.68599999999999
1271
+ - type: map_at_100
1272
+ value: 76.959
1273
+ - type: map_at_1000
1274
+ value: 76.972
1275
+ - type: map_at_3
1276
+ value: 75.024
1277
+ - type: map_at_5
1278
+ value: 76.153
1279
+ - type: mrr_at_1
1280
+ value: 71.197
1281
+ - type: mrr_at_10
1282
+ value: 81.105
1283
+ - type: mrr_at_100
1284
+ value: 81.232
1285
+ - type: mrr_at_1000
1286
+ value: 81.233
1287
+ - type: mrr_at_3
1288
+ value: 79.758
1289
+ - type: mrr_at_5
1290
+ value: 80.69
1291
+ - type: ndcg_at_1
1292
+ value: 71.197
1293
+ - type: ndcg_at_10
1294
+ value: 81.644
1295
+ - type: ndcg_at_100
1296
+ value: 82.645
1297
+ - type: ndcg_at_1000
1298
+ value: 82.879
1299
+ - type: ndcg_at_3
1300
+ value: 78.792
1301
+ - type: ndcg_at_5
1302
+ value: 80.528
1303
+ - type: precision_at_1
1304
+ value: 71.197
1305
+ - type: precision_at_10
1306
+ value: 10.206999999999999
1307
+ - type: precision_at_100
1308
+ value: 1.093
1309
+ - type: precision_at_1000
1310
+ value: 0.11299999999999999
1311
+ - type: precision_at_3
1312
+ value: 30.868000000000002
1313
+ - type: precision_at_5
1314
+ value: 19.559
1315
+ - type: recall_at_1
1316
+ value: 66.02
1317
+ - type: recall_at_10
1318
+ value: 92.50699999999999
1319
+ - type: recall_at_100
1320
+ value: 96.497
1321
+ - type: recall_at_1000
1322
+ value: 97.956
1323
+ - type: recall_at_3
1324
+ value: 84.866
1325
+ - type: recall_at_5
1326
+ value: 89.16199999999999
1327
+ - task:
1328
+ type: Retrieval
1329
+ dataset:
1330
+ type: fiqa
1331
+ name: MTEB FiQA2018
1332
+ config: default
1333
+ split: test
1334
+ revision: None
1335
+ metrics:
1336
+ - type: map_at_1
1337
+ value: 17.948
1338
+ - type: map_at_10
1339
+ value: 29.833
1340
+ - type: map_at_100
1341
+ value: 31.487
1342
+ - type: map_at_1000
1343
+ value: 31.674000000000003
1344
+ - type: map_at_3
1345
+ value: 26.029999999999998
1346
+ - type: map_at_5
1347
+ value: 28.038999999999998
1348
+ - type: mrr_at_1
1349
+ value: 34.721999999999994
1350
+ - type: mrr_at_10
1351
+ value: 44.214999999999996
1352
+ - type: mrr_at_100
1353
+ value: 44.994
1354
+ - type: mrr_at_1000
1355
+ value: 45.051
1356
+ - type: mrr_at_3
1357
+ value: 41.667
1358
+ - type: mrr_at_5
1359
+ value: 43.032
1360
+ - type: ndcg_at_1
1361
+ value: 34.721999999999994
1362
+ - type: ndcg_at_10
1363
+ value: 37.434
1364
+ - type: ndcg_at_100
1365
+ value: 43.702000000000005
1366
+ - type: ndcg_at_1000
1367
+ value: 46.993
1368
+ - type: ndcg_at_3
1369
+ value: 33.56
1370
+ - type: ndcg_at_5
1371
+ value: 34.687
1372
+ - type: precision_at_1
1373
+ value: 34.721999999999994
1374
+ - type: precision_at_10
1375
+ value: 10.401
1376
+ - type: precision_at_100
1377
+ value: 1.7049999999999998
1378
+ - type: precision_at_1000
1379
+ value: 0.22799999999999998
1380
+ - type: precision_at_3
1381
+ value: 22.531000000000002
1382
+ - type: precision_at_5
1383
+ value: 16.42
1384
+ - type: recall_at_1
1385
+ value: 17.948
1386
+ - type: recall_at_10
1387
+ value: 45.062999999999995
1388
+ - type: recall_at_100
1389
+ value: 68.191
1390
+ - type: recall_at_1000
1391
+ value: 87.954
1392
+ - type: recall_at_3
1393
+ value: 31.112000000000002
1394
+ - type: recall_at_5
1395
+ value: 36.823
1396
+ - task:
1397
+ type: Retrieval
1398
+ dataset:
1399
+ type: hotpotqa
1400
+ name: MTEB HotpotQA
1401
+ config: default
1402
+ split: test
1403
+ revision: None
1404
+ metrics:
1405
+ - type: map_at_1
1406
+ value: 36.644
1407
+ - type: map_at_10
1408
+ value: 57.658
1409
+ - type: map_at_100
1410
+ value: 58.562000000000005
1411
+ - type: map_at_1000
1412
+ value: 58.62500000000001
1413
+ - type: map_at_3
1414
+ value: 54.022999999999996
1415
+ - type: map_at_5
1416
+ value: 56.293000000000006
1417
+ - type: mrr_at_1
1418
+ value: 73.288
1419
+ - type: mrr_at_10
1420
+ value: 80.51700000000001
1421
+ - type: mrr_at_100
1422
+ value: 80.72
1423
+ - type: mrr_at_1000
1424
+ value: 80.728
1425
+ - type: mrr_at_3
1426
+ value: 79.33200000000001
1427
+ - type: mrr_at_5
1428
+ value: 80.085
1429
+ - type: ndcg_at_1
1430
+ value: 73.288
1431
+ - type: ndcg_at_10
1432
+ value: 66.61
1433
+ - type: ndcg_at_100
1434
+ value: 69.723
1435
+ - type: ndcg_at_1000
1436
+ value: 70.96000000000001
1437
+ - type: ndcg_at_3
1438
+ value: 61.358999999999995
1439
+ - type: ndcg_at_5
1440
+ value: 64.277
1441
+ - type: precision_at_1
1442
+ value: 73.288
1443
+ - type: precision_at_10
1444
+ value: 14.17
1445
+ - type: precision_at_100
1446
+ value: 1.659
1447
+ - type: precision_at_1000
1448
+ value: 0.182
1449
+ - type: precision_at_3
1450
+ value: 39.487
1451
+ - type: precision_at_5
1452
+ value: 25.999
1453
+ - type: recall_at_1
1454
+ value: 36.644
1455
+ - type: recall_at_10
1456
+ value: 70.851
1457
+ - type: recall_at_100
1458
+ value: 82.94399999999999
1459
+ - type: recall_at_1000
1460
+ value: 91.134
1461
+ - type: recall_at_3
1462
+ value: 59.230000000000004
1463
+ - type: recall_at_5
1464
+ value: 64.997
1465
+ - task:
1466
+ type: Classification
1467
+ dataset:
1468
+ type: mteb/imdb
1469
+ name: MTEB ImdbClassification
1470
+ config: default
1471
+ split: test
1472
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1473
+ metrics:
1474
+ - type: accuracy
1475
+ value: 86.00280000000001
1476
+ - type: ap
1477
+ value: 80.46302061021223
1478
+ - type: f1
1479
+ value: 85.9592921596419
1480
+ - task:
1481
+ type: Retrieval
1482
+ dataset:
1483
+ type: msmarco
1484
+ name: MTEB MSMARCO
1485
+ config: default
1486
+ split: dev
1487
+ revision: None
1488
+ metrics:
1489
+ - type: map_at_1
1490
+ value: 22.541
1491
+ - type: map_at_10
1492
+ value: 34.625
1493
+ - type: map_at_100
1494
+ value: 35.785
1495
+ - type: map_at_1000
1496
+ value: 35.831
1497
+ - type: map_at_3
1498
+ value: 30.823
1499
+ - type: map_at_5
1500
+ value: 32.967999999999996
1501
+ - type: mrr_at_1
1502
+ value: 23.180999999999997
1503
+ - type: mrr_at_10
1504
+ value: 35.207
1505
+ - type: mrr_at_100
1506
+ value: 36.315
1507
+ - type: mrr_at_1000
1508
+ value: 36.355
1509
+ - type: mrr_at_3
1510
+ value: 31.483
1511
+ - type: mrr_at_5
1512
+ value: 33.589999999999996
1513
+ - type: ndcg_at_1
1514
+ value: 23.195
1515
+ - type: ndcg_at_10
1516
+ value: 41.461
1517
+ - type: ndcg_at_100
1518
+ value: 47.032000000000004
1519
+ - type: ndcg_at_1000
1520
+ value: 48.199999999999996
1521
+ - type: ndcg_at_3
1522
+ value: 33.702
1523
+ - type: ndcg_at_5
1524
+ value: 37.522
1525
+ - type: precision_at_1
1526
+ value: 23.195
1527
+ - type: precision_at_10
1528
+ value: 6.526999999999999
1529
+ - type: precision_at_100
1530
+ value: 0.932
1531
+ - type: precision_at_1000
1532
+ value: 0.10300000000000001
1533
+ - type: precision_at_3
1534
+ value: 14.308000000000002
1535
+ - type: precision_at_5
1536
+ value: 10.507
1537
+ - type: recall_at_1
1538
+ value: 22.541
1539
+ - type: recall_at_10
1540
+ value: 62.524
1541
+ - type: recall_at_100
1542
+ value: 88.228
1543
+ - type: recall_at_1000
1544
+ value: 97.243
1545
+ - type: recall_at_3
1546
+ value: 41.38
1547
+ - type: recall_at_5
1548
+ value: 50.55
1549
+ - task:
1550
+ type: Classification
1551
+ dataset:
1552
+ type: mteb/mtop_domain
1553
+ name: MTEB MTOPDomainClassification (en)
1554
+ config: en
1555
+ split: test
1556
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1557
+ metrics:
1558
+ - type: accuracy
1559
+ value: 92.69949840401279
1560
+ - type: f1
1561
+ value: 92.54141471311786
1562
+ - task:
1563
+ type: Classification
1564
+ dataset:
1565
+ type: mteb/mtop_intent
1566
+ name: MTEB MTOPIntentClassification (en)
1567
+ config: en
1568
+ split: test
1569
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1570
+ metrics:
1571
+ - type: accuracy
1572
+ value: 72.56041951664386
1573
+ - type: f1
1574
+ value: 55.88499977508287
1575
+ - task:
1576
+ type: Classification
1577
+ dataset:
1578
+ type: mteb/amazon_massive_intent
1579
+ name: MTEB MassiveIntentClassification (en)
1580
+ config: en
1581
+ split: test
1582
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1583
+ metrics:
1584
+ - type: accuracy
1585
+ value: 71.62071284465365
1586
+ - type: f1
1587
+ value: 69.36717546572152
1588
+ - task:
1589
+ type: Classification
1590
+ dataset:
1591
+ type: mteb/amazon_massive_scenario
1592
+ name: MTEB MassiveScenarioClassification (en)
1593
+ config: en
1594
+ split: test
1595
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1596
+ metrics:
1597
+ - type: accuracy
1598
+ value: 76.35843981170142
1599
+ - type: f1
1600
+ value: 76.15496453538884
1601
+ - task:
1602
+ type: Clustering
1603
+ dataset:
1604
+ type: mteb/medrxiv-clustering-p2p
1605
+ name: MTEB MedrxivClusteringP2P
1606
+ config: default
1607
+ split: test
1608
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1609
+ metrics:
1610
+ - type: v_measure
1611
+ value: 31.33664956793118
1612
+ - task:
1613
+ type: Clustering
1614
+ dataset:
1615
+ type: mteb/medrxiv-clustering-s2s
1616
+ name: MTEB MedrxivClusteringS2S
1617
+ config: default
1618
+ split: test
1619
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1620
+ metrics:
1621
+ - type: v_measure
1622
+ value: 27.883839621715524
1623
+ - task:
1624
+ type: Reranking
1625
+ dataset:
1626
+ type: mteb/mind_small
1627
+ name: MTEB MindSmallReranking
1628
+ config: default
1629
+ split: test
1630
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1631
+ metrics:
1632
+ - type: map
1633
+ value: 30.096874986740758
1634
+ - type: mrr
1635
+ value: 30.97300481932132
1636
+ - task:
1637
+ type: Retrieval
1638
+ dataset:
1639
+ type: nfcorpus
1640
+ name: MTEB NFCorpus
1641
+ config: default
1642
+ split: test
1643
+ revision: None
1644
+ metrics:
1645
+ - type: map_at_1
1646
+ value: 5.4
1647
+ - type: map_at_10
1648
+ value: 11.852
1649
+ - type: map_at_100
1650
+ value: 14.758
1651
+ - type: map_at_1000
1652
+ value: 16.134
1653
+ - type: map_at_3
1654
+ value: 8.558
1655
+ - type: map_at_5
1656
+ value: 10.087
1657
+ - type: mrr_at_1
1658
+ value: 44.272
1659
+ - type: mrr_at_10
1660
+ value: 52.05800000000001
1661
+ - type: mrr_at_100
1662
+ value: 52.689
1663
+ - type: mrr_at_1000
1664
+ value: 52.742999999999995
1665
+ - type: mrr_at_3
1666
+ value: 50.205999999999996
1667
+ - type: mrr_at_5
1668
+ value: 51.367
1669
+ - type: ndcg_at_1
1670
+ value: 42.57
1671
+ - type: ndcg_at_10
1672
+ value: 32.449
1673
+ - type: ndcg_at_100
1674
+ value: 29.596
1675
+ - type: ndcg_at_1000
1676
+ value: 38.351
1677
+ - type: ndcg_at_3
1678
+ value: 37.044
1679
+ - type: ndcg_at_5
1680
+ value: 35.275
1681
+ - type: precision_at_1
1682
+ value: 44.272
1683
+ - type: precision_at_10
1684
+ value: 23.87
1685
+ - type: precision_at_100
1686
+ value: 7.625
1687
+ - type: precision_at_1000
1688
+ value: 2.045
1689
+ - type: precision_at_3
1690
+ value: 34.365
1691
+ - type: precision_at_5
1692
+ value: 30.341
1693
+ - type: recall_at_1
1694
+ value: 5.4
1695
+ - type: recall_at_10
1696
+ value: 15.943999999999999
1697
+ - type: recall_at_100
1698
+ value: 29.805
1699
+ - type: recall_at_1000
1700
+ value: 61.695
1701
+ - type: recall_at_3
1702
+ value: 9.539
1703
+ - type: recall_at_5
1704
+ value: 12.127
1705
+ - task:
1706
+ type: Retrieval
1707
+ dataset:
1708
+ type: nq
1709
+ name: MTEB NQ
1710
+ config: default
1711
+ split: test
1712
+ revision: None
1713
+ metrics:
1714
+ - type: map_at_1
1715
+ value: 36.047000000000004
1716
+ - type: map_at_10
1717
+ value: 51.6
1718
+ - type: map_at_100
1719
+ value: 52.449999999999996
1720
+ - type: map_at_1000
1721
+ value: 52.476
1722
+ - type: map_at_3
1723
+ value: 47.452
1724
+ - type: map_at_5
1725
+ value: 49.964
1726
+ - type: mrr_at_1
1727
+ value: 40.382
1728
+ - type: mrr_at_10
1729
+ value: 54.273
1730
+ - type: mrr_at_100
1731
+ value: 54.859
1732
+ - type: mrr_at_1000
1733
+ value: 54.876000000000005
1734
+ - type: mrr_at_3
1735
+ value: 51.014
1736
+ - type: mrr_at_5
1737
+ value: 52.983999999999995
1738
+ - type: ndcg_at_1
1739
+ value: 40.353
1740
+ - type: ndcg_at_10
1741
+ value: 59.11300000000001
1742
+ - type: ndcg_at_100
1743
+ value: 62.604000000000006
1744
+ - type: ndcg_at_1000
1745
+ value: 63.187000000000005
1746
+ - type: ndcg_at_3
1747
+ value: 51.513
1748
+ - type: ndcg_at_5
1749
+ value: 55.576
1750
+ - type: precision_at_1
1751
+ value: 40.353
1752
+ - type: precision_at_10
1753
+ value: 9.418
1754
+ - type: precision_at_100
1755
+ value: 1.1440000000000001
1756
+ - type: precision_at_1000
1757
+ value: 0.12
1758
+ - type: precision_at_3
1759
+ value: 23.078000000000003
1760
+ - type: precision_at_5
1761
+ value: 16.250999999999998
1762
+ - type: recall_at_1
1763
+ value: 36.047000000000004
1764
+ - type: recall_at_10
1765
+ value: 79.22200000000001
1766
+ - type: recall_at_100
1767
+ value: 94.23
1768
+ - type: recall_at_1000
1769
+ value: 98.51100000000001
1770
+ - type: recall_at_3
1771
+ value: 59.678
1772
+ - type: recall_at_5
1773
+ value: 68.967
1774
+ - task:
1775
+ type: Retrieval
1776
+ dataset:
1777
+ type: quora
1778
+ name: MTEB QuoraRetrieval
1779
+ config: default
1780
+ split: test
1781
+ revision: None
1782
+ metrics:
1783
+ - type: map_at_1
1784
+ value: 68.232
1785
+ - type: map_at_10
1786
+ value: 81.674
1787
+ - type: map_at_100
1788
+ value: 82.338
1789
+ - type: map_at_1000
1790
+ value: 82.36099999999999
1791
+ - type: map_at_3
1792
+ value: 78.833
1793
+ - type: map_at_5
1794
+ value: 80.58
1795
+ - type: mrr_at_1
1796
+ value: 78.64
1797
+ - type: mrr_at_10
1798
+ value: 85.164
1799
+ - type: mrr_at_100
1800
+ value: 85.317
1801
+ - type: mrr_at_1000
1802
+ value: 85.319
1803
+ - type: mrr_at_3
1804
+ value: 84.127
1805
+ - type: mrr_at_5
1806
+ value: 84.789
1807
+ - type: ndcg_at_1
1808
+ value: 78.63
1809
+ - type: ndcg_at_10
1810
+ value: 85.711
1811
+ - type: ndcg_at_100
1812
+ value: 87.238
1813
+ - type: ndcg_at_1000
1814
+ value: 87.444
1815
+ - type: ndcg_at_3
1816
+ value: 82.788
1817
+ - type: ndcg_at_5
1818
+ value: 84.313
1819
+ - type: precision_at_1
1820
+ value: 78.63
1821
+ - type: precision_at_10
1822
+ value: 12.977
1823
+ - type: precision_at_100
1824
+ value: 1.503
1825
+ - type: precision_at_1000
1826
+ value: 0.156
1827
+ - type: precision_at_3
1828
+ value: 36.113
1829
+ - type: precision_at_5
1830
+ value: 23.71
1831
+ - type: recall_at_1
1832
+ value: 68.232
1833
+ - type: recall_at_10
1834
+ value: 93.30199999999999
1835
+ - type: recall_at_100
1836
+ value: 98.799
1837
+ - type: recall_at_1000
1838
+ value: 99.885
1839
+ - type: recall_at_3
1840
+ value: 84.827
1841
+ - type: recall_at_5
1842
+ value: 89.188
1843
+ - task:
1844
+ type: Clustering
1845
+ dataset:
1846
+ type: mteb/reddit-clustering
1847
+ name: MTEB RedditClustering
1848
+ config: default
1849
+ split: test
1850
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
1851
+ metrics:
1852
+ - type: v_measure
1853
+ value: 45.71879170816294
1854
+ - task:
1855
+ type: Clustering
1856
+ dataset:
1857
+ type: mteb/reddit-clustering-p2p
1858
+ name: MTEB RedditClusteringP2P
1859
+ config: default
1860
+ split: test
1861
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
1862
+ metrics:
1863
+ - type: v_measure
1864
+ value: 59.65866311751794
1865
+ - task:
1866
+ type: Retrieval
1867
+ dataset:
1868
+ type: scidocs
1869
+ name: MTEB SCIDOCS
1870
+ config: default
1871
+ split: test
1872
+ revision: None
1873
+ metrics:
1874
+ - type: map_at_1
1875
+ value: 4.218
1876
+ - type: map_at_10
1877
+ value: 10.337
1878
+ - type: map_at_100
1879
+ value: 12.131
1880
+ - type: map_at_1000
1881
+ value: 12.411
1882
+ - type: map_at_3
1883
+ value: 7.4270000000000005
1884
+ - type: map_at_5
1885
+ value: 8.913
1886
+ - type: mrr_at_1
1887
+ value: 20.8
1888
+ - type: mrr_at_10
1889
+ value: 30.868000000000002
1890
+ - type: mrr_at_100
1891
+ value: 31.903
1892
+ - type: mrr_at_1000
1893
+ value: 31.972
1894
+ - type: mrr_at_3
1895
+ value: 27.367
1896
+ - type: mrr_at_5
1897
+ value: 29.372
1898
+ - type: ndcg_at_1
1899
+ value: 20.8
1900
+ - type: ndcg_at_10
1901
+ value: 17.765
1902
+ - type: ndcg_at_100
1903
+ value: 24.914
1904
+ - type: ndcg_at_1000
1905
+ value: 30.206
1906
+ - type: ndcg_at_3
1907
+ value: 16.64
1908
+ - type: ndcg_at_5
1909
+ value: 14.712
1910
+ - type: precision_at_1
1911
+ value: 20.8
1912
+ - type: precision_at_10
1913
+ value: 9.24
1914
+ - type: precision_at_100
1915
+ value: 1.9560000000000002
1916
+ - type: precision_at_1000
1917
+ value: 0.32299999999999995
1918
+ - type: precision_at_3
1919
+ value: 15.467
1920
+ - type: precision_at_5
1921
+ value: 12.94
1922
+ - type: recall_at_1
1923
+ value: 4.218
1924
+ - type: recall_at_10
1925
+ value: 18.752
1926
+ - type: recall_at_100
1927
+ value: 39.7
1928
+ - type: recall_at_1000
1929
+ value: 65.57300000000001
1930
+ - type: recall_at_3
1931
+ value: 9.428
1932
+ - type: recall_at_5
1933
+ value: 13.133000000000001
1934
+ - task:
1935
+ type: STS
1936
+ dataset:
1937
+ type: mteb/sickr-sts
1938
+ name: MTEB SICK-R
1939
+ config: default
1940
+ split: test
1941
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1942
+ metrics:
1943
+ - type: cos_sim_pearson
1944
+ value: 83.04338850207233
1945
+ - type: cos_sim_spearman
1946
+ value: 78.5054651430423
1947
+ - type: euclidean_pearson
1948
+ value: 80.30739451228612
1949
+ - type: euclidean_spearman
1950
+ value: 78.48377464299097
1951
+ - type: manhattan_pearson
1952
+ value: 80.40795049052781
1953
+ - type: manhattan_spearman
1954
+ value: 78.49506205443114
1955
+ - task:
1956
+ type: STS
1957
+ dataset:
1958
+ type: mteb/sts12-sts
1959
+ name: MTEB STS12
1960
+ config: default
1961
+ split: test
1962
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1963
+ metrics:
1964
+ - type: cos_sim_pearson
1965
+ value: 84.11596224442962
1966
+ - type: cos_sim_spearman
1967
+ value: 76.20997388935461
1968
+ - type: euclidean_pearson
1969
+ value: 80.56858451349109
1970
+ - type: euclidean_spearman
1971
+ value: 75.92659183871186
1972
+ - type: manhattan_pearson
1973
+ value: 80.60246102203844
1974
+ - type: manhattan_spearman
1975
+ value: 76.03018971432664
1976
+ - task:
1977
+ type: STS
1978
+ dataset:
1979
+ type: mteb/sts13-sts
1980
+ name: MTEB STS13
1981
+ config: default
1982
+ split: test
1983
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1984
+ metrics:
1985
+ - type: cos_sim_pearson
1986
+ value: 81.34691640755737
1987
+ - type: cos_sim_spearman
1988
+ value: 82.4018369631579
1989
+ - type: euclidean_pearson
1990
+ value: 81.87673092245366
1991
+ - type: euclidean_spearman
1992
+ value: 82.3671489960678
1993
+ - type: manhattan_pearson
1994
+ value: 81.88222387719948
1995
+ - type: manhattan_spearman
1996
+ value: 82.3816590344736
1997
+ - task:
1998
+ type: STS
1999
+ dataset:
2000
+ type: mteb/sts14-sts
2001
+ name: MTEB STS14
2002
+ config: default
2003
+ split: test
2004
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
2005
+ metrics:
2006
+ - type: cos_sim_pearson
2007
+ value: 81.2836092579524
2008
+ - type: cos_sim_spearman
2009
+ value: 78.99982781772064
2010
+ - type: euclidean_pearson
2011
+ value: 80.5184271010527
2012
+ - type: euclidean_spearman
2013
+ value: 78.89777392101904
2014
+ - type: manhattan_pearson
2015
+ value: 80.53585705018664
2016
+ - type: manhattan_spearman
2017
+ value: 78.92898405472994
2018
+ - task:
2019
+ type: STS
2020
+ dataset:
2021
+ type: mteb/sts15-sts
2022
+ name: MTEB STS15
2023
+ config: default
2024
+ split: test
2025
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
2026
+ metrics:
2027
+ - type: cos_sim_pearson
2028
+ value: 86.7349907750784
2029
+ - type: cos_sim_spearman
2030
+ value: 87.7611234446225
2031
+ - type: euclidean_pearson
2032
+ value: 86.98759326731624
2033
+ - type: euclidean_spearman
2034
+ value: 87.58321319424618
2035
+ - type: manhattan_pearson
2036
+ value: 87.03483090370842
2037
+ - type: manhattan_spearman
2038
+ value: 87.63278333060288
2039
+ - task:
2040
+ type: STS
2041
+ dataset:
2042
+ type: mteb/sts16-sts
2043
+ name: MTEB STS16
2044
+ config: default
2045
+ split: test
2046
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
2047
+ metrics:
2048
+ - type: cos_sim_pearson
2049
+ value: 81.75873694924825
2050
+ - type: cos_sim_spearman
2051
+ value: 83.80237999094724
2052
+ - type: euclidean_pearson
2053
+ value: 83.55023725861537
2054
+ - type: euclidean_spearman
2055
+ value: 84.12744338577744
2056
+ - type: manhattan_pearson
2057
+ value: 83.58816983036232
2058
+ - type: manhattan_spearman
2059
+ value: 84.18520748676501
2060
+ - task:
2061
+ type: STS
2062
+ dataset:
2063
+ type: mteb/sts17-crosslingual-sts
2064
+ name: MTEB STS17 (en-en)
2065
+ config: en-en
2066
+ split: test
2067
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
2068
+ metrics:
2069
+ - type: cos_sim_pearson
2070
+ value: 87.21630882940174
2071
+ - type: cos_sim_spearman
2072
+ value: 87.72382883437031
2073
+ - type: euclidean_pearson
2074
+ value: 88.69933350930333
2075
+ - type: euclidean_spearman
2076
+ value: 88.24660814383081
2077
+ - type: manhattan_pearson
2078
+ value: 88.77331018833499
2079
+ - type: manhattan_spearman
2080
+ value: 88.26109989380632
2081
+ - task:
2082
+ type: STS
2083
+ dataset:
2084
+ type: mteb/sts22-crosslingual-sts
2085
+ name: MTEB STS22 (en)
2086
+ config: en
2087
+ split: test
2088
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
2089
+ metrics:
2090
+ - type: cos_sim_pearson
2091
+ value: 61.11854063060489
2092
+ - type: cos_sim_spearman
2093
+ value: 63.14678634195072
2094
+ - type: euclidean_pearson
2095
+ value: 61.679090067000864
2096
+ - type: euclidean_spearman
2097
+ value: 62.28876589509653
2098
+ - type: manhattan_pearson
2099
+ value: 62.082324165511004
2100
+ - type: manhattan_spearman
2101
+ value: 62.56030932816679
2102
+ - task:
2103
+ type: STS
2104
+ dataset:
2105
+ type: mteb/stsbenchmark-sts
2106
+ name: MTEB STSBenchmark
2107
+ config: default
2108
+ split: test
2109
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
2110
+ metrics:
2111
+ - type: cos_sim_pearson
2112
+ value: 84.00319882832645
2113
+ - type: cos_sim_spearman
2114
+ value: 85.94529772647257
2115
+ - type: euclidean_pearson
2116
+ value: 85.6661390122756
2117
+ - type: euclidean_spearman
2118
+ value: 85.97747815545827
2119
+ - type: manhattan_pearson
2120
+ value: 85.58422770541893
2121
+ - type: manhattan_spearman
2122
+ value: 85.9237139181532
2123
+ - task:
2124
+ type: Reranking
2125
+ dataset:
2126
+ type: mteb/scidocs-reranking
2127
+ name: MTEB SciDocsRR
2128
+ config: default
2129
+ split: test
2130
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
2131
+ metrics:
2132
+ - type: map
2133
+ value: 79.16198731863916
2134
+ - type: mrr
2135
+ value: 94.25202702163487
2136
+ - task:
2137
+ type: Retrieval
2138
+ dataset:
2139
+ type: scifact
2140
+ name: MTEB SciFact
2141
+ config: default
2142
+ split: test
2143
+ revision: None
2144
+ metrics:
2145
+ - type: map_at_1
2146
+ value: 54.761
2147
+ - type: map_at_10
2148
+ value: 64.396
2149
+ - type: map_at_100
2150
+ value: 65.07
2151
+ - type: map_at_1000
2152
+ value: 65.09899999999999
2153
+ - type: map_at_3
2154
+ value: 61.846000000000004
2155
+ - type: map_at_5
2156
+ value: 63.284
2157
+ - type: mrr_at_1
2158
+ value: 57.667
2159
+ - type: mrr_at_10
2160
+ value: 65.83099999999999
2161
+ - type: mrr_at_100
2162
+ value: 66.36800000000001
2163
+ - type: mrr_at_1000
2164
+ value: 66.39399999999999
2165
+ - type: mrr_at_3
2166
+ value: 64.056
2167
+ - type: mrr_at_5
2168
+ value: 65.206
2169
+ - type: ndcg_at_1
2170
+ value: 57.667
2171
+ - type: ndcg_at_10
2172
+ value: 68.854
2173
+ - type: ndcg_at_100
2174
+ value: 71.59100000000001
2175
+ - type: ndcg_at_1000
2176
+ value: 72.383
2177
+ - type: ndcg_at_3
2178
+ value: 64.671
2179
+ - type: ndcg_at_5
2180
+ value: 66.796
2181
+ - type: precision_at_1
2182
+ value: 57.667
2183
+ - type: precision_at_10
2184
+ value: 9.167
2185
+ - type: precision_at_100
2186
+ value: 1.053
2187
+ - type: precision_at_1000
2188
+ value: 0.11199999999999999
2189
+ - type: precision_at_3
2190
+ value: 25.444
2191
+ - type: precision_at_5
2192
+ value: 16.667
2193
+ - type: recall_at_1
2194
+ value: 54.761
2195
+ - type: recall_at_10
2196
+ value: 80.9
2197
+ - type: recall_at_100
2198
+ value: 92.767
2199
+ - type: recall_at_1000
2200
+ value: 99
2201
+ - type: recall_at_3
2202
+ value: 69.672
2203
+ - type: recall_at_5
2204
+ value: 75.083
2205
+ - task:
2206
+ type: PairClassification
2207
+ dataset:
2208
+ type: mteb/sprintduplicatequestions-pairclassification
2209
+ name: MTEB SprintDuplicateQuestions
2210
+ config: default
2211
+ split: test
2212
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
2213
+ metrics:
2214
+ - type: cos_sim_accuracy
2215
+ value: 99.8079207920792
2216
+ - type: cos_sim_ap
2217
+ value: 94.88470927617445
2218
+ - type: cos_sim_f1
2219
+ value: 90.08179959100204
2220
+ - type: cos_sim_precision
2221
+ value: 92.15481171548117
2222
+ - type: cos_sim_recall
2223
+ value: 88.1
2224
+ - type: dot_accuracy
2225
+ value: 99.58613861386138
2226
+ - type: dot_ap
2227
+ value: 82.94822578881316
2228
+ - type: dot_f1
2229
+ value: 77.33333333333333
2230
+ - type: dot_precision
2231
+ value: 79.36842105263158
2232
+ - type: dot_recall
2233
+ value: 75.4
2234
+ - type: euclidean_accuracy
2235
+ value: 99.8069306930693
2236
+ - type: euclidean_ap
2237
+ value: 94.81367858031837
2238
+ - type: euclidean_f1
2239
+ value: 90.01009081735621
2240
+ - type: euclidean_precision
2241
+ value: 90.83503054989816
2242
+ - type: euclidean_recall
2243
+ value: 89.2
2244
+ - type: manhattan_accuracy
2245
+ value: 99.81188118811882
2246
+ - type: manhattan_ap
2247
+ value: 94.91405337220161
2248
+ - type: manhattan_f1
2249
+ value: 90.2763561924258
2250
+ - type: manhattan_precision
2251
+ value: 92.45283018867924
2252
+ - type: manhattan_recall
2253
+ value: 88.2
2254
+ - type: max_accuracy
2255
+ value: 99.81188118811882
2256
+ - type: max_ap
2257
+ value: 94.91405337220161
2258
+ - type: max_f1
2259
+ value: 90.2763561924258
2260
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering
+ name: MTEB StackExchangeClustering
+ config: default
+ split: test
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
+ metrics:
+ - type: v_measure
+ value: 58.511599500053094
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering-p2p
+ name: MTEB StackExchangeClusteringP2P
+ config: default
+ split: test
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
+ metrics:
+ - type: v_measure
+ value: 31.984728147814707
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/stackoverflowdupquestions-reranking
+ name: MTEB StackOverflowDupQuestions
+ config: default
+ split: test
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
+ metrics:
+ - type: map
+ value: 49.93428193939015
+ - type: mrr
+ value: 50.916557911043206
+ - task:
+ type: Summarization
+ dataset:
+ type: mteb/summeval
+ name: MTEB SummEval
+ config: default
+ split: test
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
+ metrics:
+ - type: cos_sim_pearson
+ value: 31.562500894537145
+ - type: cos_sim_spearman
+ value: 31.162587976726307
+ - type: dot_pearson
+ value: 22.633662187735762
+ - type: dot_spearman
+ value: 22.723000282378962
+ - task:
+ type: Retrieval
+ dataset:
+ type: trec-covid
+ name: MTEB TRECCOVID
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 0.219
+ - type: map_at_10
+ value: 1.871
+ - type: map_at_100
+ value: 10.487
+ - type: map_at_1000
+ value: 25.122
+ - type: map_at_3
+ value: 0.657
+ - type: map_at_5
+ value: 1.0699999999999998
+ - type: mrr_at_1
+ value: 84
+ - type: mrr_at_10
+ value: 89.567
+ - type: mrr_at_100
+ value: 89.748
+ - type: mrr_at_1000
+ value: 89.748
+ - type: mrr_at_3
+ value: 88.667
+ - type: mrr_at_5
+ value: 89.567
+ - type: ndcg_at_1
+ value: 80
+ - type: ndcg_at_10
+ value: 74.533
+ - type: ndcg_at_100
+ value: 55.839000000000006
+ - type: ndcg_at_1000
+ value: 49.748
+ - type: ndcg_at_3
+ value: 79.53099999999999
+ - type: ndcg_at_5
+ value: 78.245
+ - type: precision_at_1
+ value: 84
+ - type: precision_at_10
+ value: 78.4
+ - type: precision_at_100
+ value: 56.99999999999999
+ - type: precision_at_1000
+ value: 21.98
+ - type: precision_at_3
+ value: 85.333
+ - type: precision_at_5
+ value: 84.8
+ - type: recall_at_1
+ value: 0.219
+ - type: recall_at_10
+ value: 2.02
+ - type: recall_at_100
+ value: 13.555
+ - type: recall_at_1000
+ value: 46.739999999999995
+ - type: recall_at_3
+ value: 0.685
+ - type: recall_at_5
+ value: 1.13
+ - task:
+ type: Retrieval
+ dataset:
+ type: webis-touche2020
+ name: MTEB Touche2020
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 3.5029999999999997
+ - type: map_at_10
+ value: 11.042
+ - type: map_at_100
+ value: 16.326999999999998
+ - type: map_at_1000
+ value: 17.836
+ - type: map_at_3
+ value: 6.174
+ - type: map_at_5
+ value: 7.979
+ - type: mrr_at_1
+ value: 42.857
+ - type: mrr_at_10
+ value: 52.617000000000004
+ - type: mrr_at_100
+ value: 53.351000000000006
+ - type: mrr_at_1000
+ value: 53.351000000000006
+ - type: mrr_at_3
+ value: 46.939
+ - type: mrr_at_5
+ value: 50.714000000000006
+ - type: ndcg_at_1
+ value: 38.775999999999996
+ - type: ndcg_at_10
+ value: 27.125
+ - type: ndcg_at_100
+ value: 35.845
+ - type: ndcg_at_1000
+ value: 47.377
+ - type: ndcg_at_3
+ value: 29.633
+ - type: ndcg_at_5
+ value: 28.378999999999998
+ - type: precision_at_1
+ value: 42.857
+ - type: precision_at_10
+ value: 24.082
+ - type: precision_at_100
+ value: 6.877999999999999
+ - type: precision_at_1000
+ value: 1.463
+ - type: precision_at_3
+ value: 29.932
+ - type: precision_at_5
+ value: 28.571
+ - type: recall_at_1
+ value: 3.5029999999999997
+ - type: recall_at_10
+ value: 17.068
+ - type: recall_at_100
+ value: 43.361
+ - type: recall_at_1000
+ value: 78.835
+ - type: recall_at_3
+ value: 6.821000000000001
+ - type: recall_at_5
+ value: 10.357
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/toxic_conversations_50k
+ name: MTEB ToxicConversationsClassification
+ config: default
+ split: test
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
+ metrics:
+ - type: accuracy
+ value: 71.0954
+ - type: ap
+ value: 14.216844153511959
+ - type: f1
+ value: 54.63687418565117
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/tweet_sentiment_extraction
+ name: MTEB TweetSentimentExtractionClassification
+ config: default
+ split: test
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
+ metrics:
+ - type: accuracy
+ value: 61.46293152235427
+ - type: f1
+ value: 61.744177921638645
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/twentynewsgroups-clustering
+ name: MTEB TwentyNewsgroupsClustering
+ config: default
+ split: test
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
+ metrics:
+ - type: v_measure
+ value: 41.12708617788644
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twittersemeval2015-pairclassification
+ name: MTEB TwitterSemEval2015
+ config: default
+ split: test
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
+ metrics:
+ - type: cos_sim_accuracy
+ value: 85.75430649102938
+ - type: cos_sim_ap
+ value: 73.34252536948081
+ - type: cos_sim_f1
+ value: 67.53758935173774
+ - type: cos_sim_precision
+ value: 63.3672525439408
+ - type: cos_sim_recall
+ value: 72.29551451187335
+ - type: dot_accuracy
+ value: 81.71305954580676
+ - type: dot_ap
+ value: 59.5532209082386
+ - type: dot_f1
+ value: 56.18466898954705
+ - type: dot_precision
+ value: 47.830923248053395
+ - type: dot_recall
+ value: 68.07387862796834
+ - type: euclidean_accuracy
+ value: 85.81987244441795
+ - type: euclidean_ap
+ value: 73.34325409809446
+ - type: euclidean_f1
+ value: 67.83451360417443
+ - type: euclidean_precision
+ value: 64.09955388588871
+ - type: euclidean_recall
+ value: 72.0316622691293
+ - type: manhattan_accuracy
+ value: 85.68277999642368
+ - type: manhattan_ap
+ value: 73.1535450121903
+ - type: manhattan_f1
+ value: 67.928237896289
+ - type: manhattan_precision
+ value: 63.56945722171113
+ - type: manhattan_recall
+ value: 72.9287598944591
+ - type: max_accuracy
+ value: 85.81987244441795
+ - type: max_ap
+ value: 73.34325409809446
+ - type: max_f1
+ value: 67.928237896289
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twitterurlcorpus-pairclassification
+ name: MTEB TwitterURLCorpus
+ config: default
+ split: test
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
+ metrics:
+ - type: cos_sim_accuracy
+ value: 88.90441262079403
+ - type: cos_sim_ap
+ value: 85.79331880741438
+ - type: cos_sim_f1
+ value: 78.31563529842548
+ - type: cos_sim_precision
+ value: 74.6683424102779
+ - type: cos_sim_recall
+ value: 82.33754234678165
+ - type: dot_accuracy
+ value: 84.89928978926534
+ - type: dot_ap
+ value: 75.25819218316
+ - type: dot_f1
+ value: 69.88730119720536
+ - type: dot_precision
+ value: 64.23362374959665
+ - type: dot_recall
+ value: 76.63227594702803
+ - type: euclidean_accuracy
+ value: 89.01695967710637
+ - type: euclidean_ap
+ value: 85.98986606038852
+ - type: euclidean_f1
+ value: 78.5277880014722
+ - type: euclidean_precision
+ value: 75.22211253701876
+ - type: euclidean_recall
+ value: 82.13735756082538
+ - type: manhattan_accuracy
+ value: 88.99561454573679
+ - type: manhattan_ap
+ value: 85.92262421793953
+ - type: manhattan_f1
+ value: 78.38866094740769
+ - type: manhattan_precision
+ value: 76.02373028505282
+ - type: manhattan_recall
+ value: 80.9054511857099
+ - type: max_accuracy
+ value: 89.01695967710637
+ - type: max_ap
+ value: 85.98986606038852
+ - type: max_f1
+ value: 78.5277880014722
+ language:
+ - en
+ license: mit
+ ---
+
+ # E5-small-v2
+
+ [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
+ Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
+
+ This model has 12 layers and an embedding size of 384.
+
+ ## Usage
+
+ Below is an example of encoding queries and passages from the MS-MARCO passage ranking dataset.
+
+ ```python
+ import torch.nn.functional as F
+
+ from torch import Tensor
+ from transformers import AutoTokenizer, AutoModel
+
+
+ def average_pool(last_hidden_states: Tensor,
+                  attention_mask: Tensor) -> Tensor:
+     last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
+     return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
+
+
+ # Each input text should start with "query: " or "passage: ".
+ # For tasks other than retrieval, you can simply use the "query: " prefix.
+ input_texts = ['query: how much protein should a female eat',
+                'query: summit define',
+                "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
+                "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
+
+ tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-small-v2')
+ model = AutoModel.from_pretrained('intfloat/e5-small-v2')
+
+ # Tokenize the input texts
+ batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
+
+ outputs = model(**batch_dict)
+ embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
+
+ # (Optionally) normalize embeddings
+ embeddings = F.normalize(embeddings, p=2, dim=1)
+ scores = (embeddings[:2] @ embeddings[2:].T) * 100
+ print(scores.tolist())
+ ```
+
+ ## Training Details
+
+ Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
+
+ ## Benchmark Evaluation
+
+ Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
+ on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB](https://arxiv.org/abs/2210.07316) benchmarks.
+
+ ## Citation
+
+ If you find our paper or models helpful, please consider citing as follows:
+
+ ```
+ @article{wang2022text,
+   title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
+   author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
+   journal={arXiv preprint arXiv:2212.03533},
+   year={2022}
+ }
+ ```
+
+ ## Limitations
+
+ This model works only for English text. Long inputs are truncated to at most 512 tokens.
+
+ ## Sentence Transformers
+
+ Below is an example of usage with `sentence_transformers` (`pip install sentence_transformers~=2.2.2`).
+ This integration is community contributed, and results may differ from the `transformers` example above by up to numerical precision.
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ model = SentenceTransformer('intfloat/e5-small-v2')
+
+ # Remember to add the "query: " / "passage: " prefixes described above.
+ input_texts = ['query: how much protein should a female eat',
+                'query: summit define']
+ embeddings = model.encode(input_texts, normalize_embeddings=True)
+ ```
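Because `normalize_embeddings=True` (like `F.normalize` in the `transformers` example) makes every embedding unit-length, a plain dot product between two embeddings equals their cosine similarity, and the `* 100` in the scoring example only rescales it. A toy illustration with hypothetical unit vectors, not actual model embeddings:

```python
import numpy as np

# Two hypothetical L2-normalized embeddings.
q = np.array([0.6, 0.8])
p = np.array([0.8, 0.6])

# For unit vectors, the dot product equals the cosine similarity.
score = float(q @ p) * 100
print(round(score, 1))  # 96.0
```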
config.json ADDED
@@ -0,0 +1,6 @@
+ {
+   "bos_token": "<s>",
+   "eos_token": "</s>",
+   "layer_norm_epsilon": 1e-12,
+   "unk_token": "[UNK]"
+ }
model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5e879493b1a8969c853c56602b6f4233207663263a83b1a881cc0b35646df17a
+ size 34433870
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "cls_token": "[CLS]",
+   "mask_token": "[MASK]",
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "unk_token": "[UNK]"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer_config.json ADDED
@@ -0,0 +1,15 @@
+ {
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_basic_tokenize": true,
+   "do_lower_case": true,
+   "mask_token": "[MASK]",
+   "model_max_length": 1000000000000000019884624838656,
+   "never_split": null,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "unk_token": "[UNK]"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff
vocabulary.json ADDED
The diff for this file is too large to render. See raw diff