thtang committed on
Commit d52fbe0
1 Parent(s): c381c3e

Update README.md

Files changed (1)
  1. README.md +2510 -6
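The diff below replaces the stock sentence-transformers frontmatter with MTEB `model-index` metadata (benchmark scores for the `ALL_862873` model). As a rough, hypothetical sketch of how scores of this kind are usually produced, and not something contained in this commit: the `mteb` package is run over a sentence-transformers checkpoint, and the resulting per-task scores are then folded into the YAML block shown in the diff. The model path and task subset below are placeholders.

```python
# Hypothetical sketch (not part of this commit): producing MTEB scores of the
# kind listed in the model-index below for a sentence-similarity model.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model_name = "ALL_862873"                # model name used in the new frontmatter
model = SentenceTransformer(model_name)  # assumes a SentenceTransformer-compatible checkpoint

# Evaluate a small subset of the English tasks that appear in the diff.
evaluation = MTEB(tasks=["Banking77Classification", "BIOSSES", "SciFact"])
evaluation.run(model, output_folder=f"results/{model_name}")
# The per-task results are then converted into the `model-index:` YAML added to README.md.
```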
README.md CHANGED
@@ -1,11 +1,2515 @@
  ---
- pipeline_tag: sentence-similarity
  tags:
- - sentence-transformers
- - feature-extraction
- - sentence-similarity
- - transformers
-
  ---
 
  # {MODEL_NAME}

  ---
  tags:
+ - mteb
+ model-index:
+ - name: ALL_862873
+ results:
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/amazon_counterfactual
+ name: MTEB AmazonCounterfactualClassification (en)
+ config: en
+ split: test
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
+ metrics:
+ - type: accuracy
+ value: 50.805970149253746
+ - type: ap
+ value: 21.350961103104364
+ - type: f1
+ value: 46.546166439875044
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/amazon_polarity
+ name: MTEB AmazonPolarityClassification
+ config: default
+ split: test
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
+ metrics:
+ - type: accuracy
+ value: 52.567125000000004
+ - type: ap
+ value: 51.37893936391345
+ - type: f1
+ value: 51.8411977908125
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/amazon_reviews_multi
+ name: MTEB AmazonReviewsClassification (en)
+ config: en
+ split: test
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
+ metrics:
+ - type: accuracy
+ value: 22.63
+ - type: f1
+ value: 21.964526516204575
+ - task:
+ type: Retrieval
+ dataset:
+ type: arguana
+ name: MTEB ArguAna
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 1.991
+ - type: map_at_10
+ value: 4.095
+ - type: map_at_100
+ value: 4.763
+ - type: map_at_1000
+ value: 4.8759999999999994
+ - type: map_at_3
+ value: 3.3070000000000004
+ - type: map_at_5
+ value: 3.73
+ - type: mrr_at_1
+ value: 2.0629999999999997
+ - type: mrr_at_10
+ value: 4.119
+ - type: mrr_at_100
+ value: 4.787
+ - type: mrr_at_1000
+ value: 4.9
+ - type: mrr_at_3
+ value: 3.331
+ - type: mrr_at_5
+ value: 3.768
+ - type: ndcg_at_1
+ value: 1.991
+ - type: ndcg_at_10
+ value: 5.346
+ - type: ndcg_at_100
+ value: 9.181000000000001
+ - type: ndcg_at_1000
+ value: 13.004
+ - type: ndcg_at_3
+ value: 3.7199999999999998
+ - type: ndcg_at_5
+ value: 4.482
+ - type: precision_at_1
+ value: 1.991
+ - type: precision_at_10
+ value: 0.9390000000000001
+ - type: precision_at_100
+ value: 0.28700000000000003
+ - type: precision_at_1000
+ value: 0.061
+ - type: precision_at_3
+ value: 1.636
+ - type: precision_at_5
+ value: 1.351
+ - type: recall_at_1
+ value: 1.991
+ - type: recall_at_10
+ value: 9.388
+ - type: recall_at_100
+ value: 28.663
+ - type: recall_at_1000
+ value: 60.597
+ - type: recall_at_3
+ value: 4.9079999999999995
+ - type: recall_at_5
+ value: 6.757000000000001
+ - task:
120
+ type: Clustering
121
+ dataset:
122
+ type: mteb/arxiv-clustering-p2p
123
+ name: MTEB ArxivClusteringP2P
124
+ config: default
125
+ split: test
126
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
127
+ metrics:
128
+ - type: v_measure
129
+ value: 14.790995349964428
130
+ - task:
131
+ type: Clustering
132
+ dataset:
133
+ type: mteb/arxiv-clustering-s2s
134
+ name: MTEB ArxivClusteringS2S
135
+ config: default
136
+ split: test
137
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
138
+ metrics:
139
+ - type: v_measure
140
+ value: 12.248406292959412
141
+ - task:
142
+ type: Reranking
143
+ dataset:
144
+ type: mteb/askubuntudupquestions-reranking
145
+ name: MTEB AskUbuntuDupQuestions
146
+ config: default
147
+ split: test
148
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
149
+ metrics:
150
+ - type: map
151
+ value: 44.88116875696166
152
+ - type: mrr
153
+ value: 56.07439651760981
154
+ - task:
155
+ type: STS
156
+ dataset:
157
+ type: mteb/biosses-sts
158
+ name: MTEB BIOSSES
159
+ config: default
160
+ split: test
161
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
162
+ metrics:
163
+ - type: cos_sim_pearson
164
+ value: 19.26573437410263
165
+ - type: cos_sim_spearman
166
+ value: 21.34145013484056
167
+ - type: euclidean_pearson
168
+ value: 22.39226418475093
169
+ - type: euclidean_spearman
170
+ value: 23.511981519581447
171
+ - type: manhattan_pearson
172
+ value: 22.14346931904813
173
+ - type: manhattan_spearman
174
+ value: 23.39390654000631
175
+ - task:
176
+ type: Classification
177
+ dataset:
178
+ type: mteb/banking77
179
+ name: MTEB Banking77Classification
180
+ config: default
181
+ split: test
182
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
183
+ metrics:
184
+ - type: accuracy
185
+ value: 36.42857142857143
186
+ - type: f1
187
+ value: 34.81640976406094
188
+ - task:
189
+ type: Clustering
190
+ dataset:
191
+ type: mteb/biorxiv-clustering-p2p
192
+ name: MTEB BiorxivClusteringP2P
193
+ config: default
194
+ split: test
195
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
196
+ metrics:
197
+ - type: v_measure
198
+ value: 13.94296328377691
199
+ - task:
200
+ type: Clustering
201
+ dataset:
202
+ type: mteb/biorxiv-clustering-s2s
203
+ name: MTEB BiorxivClusteringS2S
204
+ config: default
205
+ split: test
206
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
207
+ metrics:
208
+ - type: v_measure
209
+ value: 9.790764523161606
210
+ - task:
211
+ type: Retrieval
212
+ dataset:
213
+ type: BeIR/cqadupstack
214
+ name: MTEB CQADupstackAndroidRetrieval
215
+ config: default
216
+ split: test
217
+ revision: None
218
+ metrics:
219
+ - type: map_at_1
220
+ value: 0.968
221
+ - type: map_at_10
222
+ value: 2.106
223
+ - type: map_at_100
224
+ value: 2.411
225
+ - type: map_at_1000
226
+ value: 2.4899999999999998
227
+ - type: map_at_3
228
+ value: 1.797
229
+ - type: map_at_5
230
+ value: 1.9959999999999998
231
+ - type: mrr_at_1
232
+ value: 1.717
233
+ - type: mrr_at_10
234
+ value: 3.0349999999999997
235
+ - type: mrr_at_100
236
+ value: 3.4029999999999996
237
+ - type: mrr_at_1000
238
+ value: 3.486
239
+ - type: mrr_at_3
240
+ value: 2.6470000000000002
241
+ - type: mrr_at_5
242
+ value: 2.876
243
+ - type: ndcg_at_1
244
+ value: 1.717
245
+ - type: ndcg_at_10
246
+ value: 2.9059999999999997
247
+ - type: ndcg_at_100
248
+ value: 4.715
249
+ - type: ndcg_at_1000
250
+ value: 7.318
251
+ - type: ndcg_at_3
252
+ value: 2.415
253
+ - type: ndcg_at_5
254
+ value: 2.682
255
+ - type: precision_at_1
256
+ value: 1.717
257
+ - type: precision_at_10
258
+ value: 0.658
259
+ - type: precision_at_100
260
+ value: 0.197
261
+ - type: precision_at_1000
262
+ value: 0.054
263
+ - type: precision_at_3
264
+ value: 1.431
265
+ - type: precision_at_5
266
+ value: 1.059
267
+ - type: recall_at_1
268
+ value: 0.968
269
+ - type: recall_at_10
270
+ value: 4.531000000000001
271
+ - type: recall_at_100
272
+ value: 13.081000000000001
273
+ - type: recall_at_1000
274
+ value: 32.443
275
+ - type: recall_at_3
276
+ value: 2.8850000000000002
277
+ - type: recall_at_5
278
+ value: 3.768
279
+ - task:
280
+ type: Retrieval
281
+ dataset:
282
+ type: BeIR/cqadupstack
283
+ name: MTEB CQADupstackEnglishRetrieval
284
+ config: default
285
+ split: test
286
+ revision: None
287
+ metrics:
288
+ - type: map_at_1
289
+ value: 0.9390000000000001
290
+ - type: map_at_10
291
+ value: 1.516
292
+ - type: map_at_100
293
+ value: 1.6680000000000001
294
+ - type: map_at_1000
295
+ value: 1.701
296
+ - type: map_at_3
297
+ value: 1.314
298
+ - type: map_at_5
299
+ value: 1.388
300
+ - type: mrr_at_1
301
+ value: 1.146
302
+ - type: mrr_at_10
303
+ value: 1.96
304
+ - type: mrr_at_100
305
+ value: 2.166
306
+ - type: mrr_at_1000
307
+ value: 2.207
308
+ - type: mrr_at_3
309
+ value: 1.72
310
+ - type: mrr_at_5
311
+ value: 1.796
312
+ - type: ndcg_at_1
313
+ value: 1.146
314
+ - type: ndcg_at_10
315
+ value: 1.9769999999999999
316
+ - type: ndcg_at_100
317
+ value: 2.8400000000000003
318
+ - type: ndcg_at_1000
319
+ value: 4.035
320
+ - type: ndcg_at_3
321
+ value: 1.5859999999999999
322
+ - type: ndcg_at_5
323
+ value: 1.6709999999999998
324
+ - type: precision_at_1
325
+ value: 1.146
326
+ - type: precision_at_10
327
+ value: 0.43299999999999994
328
+ - type: precision_at_100
329
+ value: 0.11100000000000002
330
+ - type: precision_at_1000
331
+ value: 0.027999999999999997
332
+ - type: precision_at_3
333
+ value: 0.8699999999999999
334
+ - type: precision_at_5
335
+ value: 0.611
336
+ - type: recall_at_1
337
+ value: 0.9390000000000001
338
+ - type: recall_at_10
339
+ value: 2.949
340
+ - type: recall_at_100
341
+ value: 6.737
342
+ - type: recall_at_1000
343
+ value: 15.604999999999999
344
+ - type: recall_at_3
345
+ value: 1.846
346
+ - type: recall_at_5
347
+ value: 2.08
348
+ - task:
349
+ type: Retrieval
350
+ dataset:
351
+ type: BeIR/cqadupstack
352
+ name: MTEB CQADupstackGamingRetrieval
353
+ config: default
354
+ split: test
355
+ revision: None
356
+ metrics:
357
+ - type: map_at_1
358
+ value: 1.28
359
+ - type: map_at_10
360
+ value: 2.157
361
+ - type: map_at_100
362
+ value: 2.401
363
+ - type: map_at_1000
364
+ value: 2.4570000000000003
365
+ - type: map_at_3
366
+ value: 1.865
367
+ - type: map_at_5
368
+ value: 1.928
369
+ - type: mrr_at_1
370
+ value: 1.505
371
+ - type: mrr_at_10
372
+ value: 2.52
373
+ - type: mrr_at_100
374
+ value: 2.782
375
+ - type: mrr_at_1000
376
+ value: 2.8400000000000003
377
+ - type: mrr_at_3
378
+ value: 2.1839999999999997
379
+ - type: mrr_at_5
380
+ value: 2.2689999999999997
381
+ - type: ndcg_at_1
382
+ value: 1.505
383
+ - type: ndcg_at_10
384
+ value: 2.798
385
+ - type: ndcg_at_100
386
+ value: 4.2090000000000005
387
+ - type: ndcg_at_1000
388
+ value: 6.105
389
+ - type: ndcg_at_3
390
+ value: 2.157
391
+ - type: ndcg_at_5
392
+ value: 2.258
393
+ - type: precision_at_1
394
+ value: 1.505
395
+ - type: precision_at_10
396
+ value: 0.5519999999999999
397
+ - type: precision_at_100
398
+ value: 0.146
399
+ - type: precision_at_1000
400
+ value: 0.034999999999999996
401
+ - type: precision_at_3
402
+ value: 1.024
403
+ - type: precision_at_5
404
+ value: 0.7020000000000001
405
+ - type: recall_at_1
406
+ value: 1.28
407
+ - type: recall_at_10
408
+ value: 4.455
409
+ - type: recall_at_100
410
+ value: 11.169
411
+ - type: recall_at_1000
412
+ value: 26.046000000000003
413
+ - type: recall_at_3
414
+ value: 2.6270000000000002
415
+ - type: recall_at_5
416
+ value: 2.899
417
+ - task:
418
+ type: Retrieval
419
+ dataset:
420
+ type: BeIR/cqadupstack
421
+ name: MTEB CQADupstackGisRetrieval
422
+ config: default
423
+ split: test
424
+ revision: None
425
+ metrics:
426
+ - type: map_at_1
427
+ value: 0.264
428
+ - type: map_at_10
429
+ value: 0.615
430
+ - type: map_at_100
431
+ value: 0.76
432
+ - type: map_at_1000
433
+ value: 0.803
434
+ - type: map_at_3
435
+ value: 0.40499999999999997
436
+ - type: map_at_5
437
+ value: 0.512
438
+ - type: mrr_at_1
439
+ value: 0.33899999999999997
440
+ - type: mrr_at_10
441
+ value: 0.718
442
+ - type: mrr_at_100
443
+ value: 0.8880000000000001
444
+ - type: mrr_at_1000
445
+ value: 0.935
446
+ - type: mrr_at_3
447
+ value: 0.508
448
+ - type: mrr_at_5
449
+ value: 0.616
450
+ - type: ndcg_at_1
451
+ value: 0.33899999999999997
452
+ - type: ndcg_at_10
453
+ value: 0.9079999999999999
454
+ - type: ndcg_at_100
455
+ value: 1.9029999999999998
456
+ - type: ndcg_at_1000
457
+ value: 3.4939999999999998
458
+ - type: ndcg_at_3
459
+ value: 0.46499999999999997
460
+ - type: ndcg_at_5
461
+ value: 0.655
462
+ - type: precision_at_1
463
+ value: 0.33899999999999997
464
+ - type: precision_at_10
465
+ value: 0.192
466
+ - type: precision_at_100
467
+ value: 0.079
468
+ - type: precision_at_1000
469
+ value: 0.023
470
+ - type: precision_at_3
471
+ value: 0.22599999999999998
472
+ - type: precision_at_5
473
+ value: 0.22599999999999998
474
+ - type: recall_at_1
475
+ value: 0.264
476
+ - type: recall_at_10
477
+ value: 1.789
478
+ - type: recall_at_100
479
+ value: 6.927
480
+ - type: recall_at_1000
481
+ value: 19.922
482
+ - type: recall_at_3
483
+ value: 0.5459999999999999
484
+ - type: recall_at_5
485
+ value: 0.9979999999999999
486
+ - task:
487
+ type: Retrieval
488
+ dataset:
489
+ type: BeIR/cqadupstack
490
+ name: MTEB CQADupstackMathematicaRetrieval
491
+ config: default
492
+ split: test
493
+ revision: None
494
+ metrics:
495
+ - type: map_at_1
496
+ value: 0.5599999999999999
497
+ - type: map_at_10
498
+ value: 0.9129999999999999
499
+ - type: map_at_100
500
+ value: 1.027
501
+ - type: map_at_1000
502
+ value: 1.072
503
+ - type: map_at_3
504
+ value: 0.715
505
+ - type: map_at_5
506
+ value: 0.826
507
+ - type: mrr_at_1
508
+ value: 0.8710000000000001
509
+ - type: mrr_at_10
510
+ value: 1.331
511
+ - type: mrr_at_100
512
+ value: 1.494
513
+ - type: mrr_at_1000
514
+ value: 1.547
515
+ - type: mrr_at_3
516
+ value: 1.119
517
+ - type: mrr_at_5
518
+ value: 1.269
519
+ - type: ndcg_at_1
520
+ value: 0.8710000000000001
521
+ - type: ndcg_at_10
522
+ value: 1.2590000000000001
523
+ - type: ndcg_at_100
524
+ value: 2.023
525
+ - type: ndcg_at_1000
526
+ value: 3.737
527
+ - type: ndcg_at_3
528
+ value: 0.8750000000000001
529
+ - type: ndcg_at_5
530
+ value: 1.079
531
+ - type: precision_at_1
532
+ value: 0.8710000000000001
533
+ - type: precision_at_10
534
+ value: 0.28600000000000003
535
+ - type: precision_at_100
536
+ value: 0.086
537
+ - type: precision_at_1000
538
+ value: 0.027999999999999997
539
+ - type: precision_at_3
540
+ value: 0.498
541
+ - type: precision_at_5
542
+ value: 0.42300000000000004
543
+ - type: recall_at_1
544
+ value: 0.5599999999999999
545
+ - type: recall_at_10
546
+ value: 1.907
547
+ - type: recall_at_100
548
+ value: 5.492
549
+ - type: recall_at_1000
550
+ value: 18.974
551
+ - type: recall_at_3
552
+ value: 0.943
553
+ - type: recall_at_5
554
+ value: 1.41
555
+ - task:
556
+ type: Retrieval
557
+ dataset:
558
+ type: BeIR/cqadupstack
559
+ name: MTEB CQADupstackPhysicsRetrieval
560
+ config: default
561
+ split: test
562
+ revision: None
563
+ metrics:
564
+ - type: map_at_1
565
+ value: 1.9720000000000002
566
+ - type: map_at_10
567
+ value: 2.968
568
+ - type: map_at_100
569
+ value: 3.2009999999999996
570
+ - type: map_at_1000
571
+ value: 3.2680000000000002
572
+ - type: map_at_3
573
+ value: 2.683
574
+ - type: map_at_5
575
+ value: 2.8369999999999997
576
+ - type: mrr_at_1
577
+ value: 2.406
578
+ - type: mrr_at_10
579
+ value: 3.567
580
+ - type: mrr_at_100
581
+ value: 3.884
582
+ - type: mrr_at_1000
583
+ value: 3.948
584
+ - type: mrr_at_3
585
+ value: 3.2239999999999998
586
+ - type: mrr_at_5
587
+ value: 3.383
588
+ - type: ndcg_at_1
589
+ value: 2.406
590
+ - type: ndcg_at_10
591
+ value: 3.63
592
+ - type: ndcg_at_100
593
+ value: 5.155
594
+ - type: ndcg_at_1000
595
+ value: 7.381
596
+ - type: ndcg_at_3
597
+ value: 3.078
598
+ - type: ndcg_at_5
599
+ value: 3.3070000000000004
600
+ - type: precision_at_1
601
+ value: 2.406
602
+ - type: precision_at_10
603
+ value: 0.635
604
+ - type: precision_at_100
605
+ value: 0.184
606
+ - type: precision_at_1000
607
+ value: 0.048
608
+ - type: precision_at_3
609
+ value: 1.4120000000000001
610
+ - type: precision_at_5
611
+ value: 1.001
612
+ - type: recall_at_1
613
+ value: 1.9720000000000002
614
+ - type: recall_at_10
615
+ value: 5.152
616
+ - type: recall_at_100
617
+ value: 12.173
618
+ - type: recall_at_1000
619
+ value: 28.811999999999998
620
+ - type: recall_at_3
621
+ value: 3.556
622
+ - type: recall_at_5
623
+ value: 4.181
624
+ - task:
625
+ type: Retrieval
626
+ dataset:
627
+ type: BeIR/cqadupstack
628
+ name: MTEB CQADupstackProgrammersRetrieval
629
+ config: default
630
+ split: test
631
+ revision: None
632
+ metrics:
633
+ - type: map_at_1
634
+ value: 0.346
635
+ - type: map_at_10
636
+ value: 0.619
637
+ - type: map_at_100
638
+ value: 0.743
639
+ - type: map_at_1000
640
+ value: 0.788
641
+ - type: map_at_3
642
+ value: 0.5369999999999999
643
+ - type: map_at_5
644
+ value: 0.551
645
+ - type: mrr_at_1
646
+ value: 0.571
647
+ - type: mrr_at_10
648
+ value: 1.0619999999999998
649
+ - type: mrr_at_100
650
+ value: 1.2109999999999999
651
+ - type: mrr_at_1000
652
+ value: 1.265
653
+ - type: mrr_at_3
654
+ value: 0.818
655
+ - type: mrr_at_5
656
+ value: 0.927
657
+ - type: ndcg_at_1
658
+ value: 0.571
659
+ - type: ndcg_at_10
660
+ value: 0.919
661
+ - type: ndcg_at_100
662
+ value: 1.688
663
+ - type: ndcg_at_1000
664
+ value: 3.3649999999999998
665
+ - type: ndcg_at_3
666
+ value: 0.6779999999999999
667
+ - type: ndcg_at_5
668
+ value: 0.7230000000000001
669
+ - type: precision_at_1
670
+ value: 0.571
671
+ - type: precision_at_10
672
+ value: 0.27399999999999997
673
+ - type: precision_at_100
674
+ value: 0.084
675
+ - type: precision_at_1000
676
+ value: 0.029
677
+ - type: precision_at_3
678
+ value: 0.381
679
+ - type: precision_at_5
680
+ value: 0.32
681
+ - type: recall_at_1
682
+ value: 0.346
683
+ - type: recall_at_10
684
+ value: 1.397
685
+ - type: recall_at_100
686
+ value: 5.079000000000001
687
+ - type: recall_at_1000
688
+ value: 18.060000000000002
689
+ - type: recall_at_3
690
+ value: 0.774
691
+ - type: recall_at_5
692
+ value: 0.8340000000000001
693
+ - task:
694
+ type: Retrieval
695
+ dataset:
696
+ type: BeIR/cqadupstack
697
+ name: MTEB CQADupstackStatsRetrieval
698
+ config: default
699
+ split: test
700
+ revision: None
701
+ metrics:
702
+ - type: map_at_1
703
+ value: 0.69
704
+ - type: map_at_10
705
+ value: 0.897
706
+ - type: map_at_100
707
+ value: 1.0030000000000001
708
+ - type: map_at_1000
709
+ value: 1.034
710
+ - type: map_at_3
711
+ value: 0.818
712
+ - type: map_at_5
713
+ value: 0.864
714
+ - type: mrr_at_1
715
+ value: 0.767
716
+ - type: mrr_at_10
717
+ value: 1.008
718
+ - type: mrr_at_100
719
+ value: 1.145
720
+ - type: mrr_at_1000
721
+ value: 1.183
722
+ - type: mrr_at_3
723
+ value: 0.895
724
+ - type: mrr_at_5
725
+ value: 0.9560000000000001
726
+ - type: ndcg_at_1
727
+ value: 0.767
728
+ - type: ndcg_at_10
729
+ value: 1.0739999999999998
730
+ - type: ndcg_at_100
731
+ value: 1.757
732
+ - type: ndcg_at_1000
733
+ value: 2.9090000000000003
734
+ - type: ndcg_at_3
735
+ value: 0.881
736
+ - type: ndcg_at_5
737
+ value: 0.9769999999999999
738
+ - type: precision_at_1
739
+ value: 0.767
740
+ - type: precision_at_10
741
+ value: 0.184
742
+ - type: precision_at_100
743
+ value: 0.06
744
+ - type: precision_at_1000
745
+ value: 0.018000000000000002
746
+ - type: precision_at_3
747
+ value: 0.358
748
+ - type: precision_at_5
749
+ value: 0.27599999999999997
750
+ - type: recall_at_1
751
+ value: 0.69
752
+ - type: recall_at_10
753
+ value: 1.508
754
+ - type: recall_at_100
755
+ value: 4.858
756
+ - type: recall_at_1000
757
+ value: 14.007
758
+ - type: recall_at_3
759
+ value: 0.997
760
+ - type: recall_at_5
761
+ value: 1.2269999999999999
762
+ - task:
763
+ type: Retrieval
764
+ dataset:
765
+ type: BeIR/cqadupstack
766
+ name: MTEB CQADupstackTexRetrieval
767
+ config: default
768
+ split: test
769
+ revision: None
770
+ metrics:
771
+ - type: map_at_1
772
+ value: 0.338
773
+ - type: map_at_10
774
+ value: 0.661
775
+ - type: map_at_100
776
+ value: 0.7969999999999999
777
+ - type: map_at_1000
778
+ value: 0.8290000000000001
779
+ - type: map_at_3
780
+ value: 0.5559999999999999
781
+ - type: map_at_5
782
+ value: 0.5910000000000001
783
+ - type: mrr_at_1
784
+ value: 0.482
785
+ - type: mrr_at_10
786
+ value: 0.88
787
+ - type: mrr_at_100
788
+ value: 1.036
789
+ - type: mrr_at_1000
790
+ value: 1.075
791
+ - type: mrr_at_3
792
+ value: 0.74
793
+ - type: mrr_at_5
794
+ value: 0.779
795
+ - type: ndcg_at_1
796
+ value: 0.482
797
+ - type: ndcg_at_10
798
+ value: 0.924
799
+ - type: ndcg_at_100
800
+ value: 1.736
801
+ - type: ndcg_at_1000
802
+ value: 2.926
803
+ - type: ndcg_at_3
804
+ value: 0.677
805
+ - type: ndcg_at_5
806
+ value: 0.732
807
+ - type: precision_at_1
808
+ value: 0.482
809
+ - type: precision_at_10
810
+ value: 0.20600000000000002
811
+ - type: precision_at_100
812
+ value: 0.078
813
+ - type: precision_at_1000
814
+ value: 0.023
815
+ - type: precision_at_3
816
+ value: 0.367
817
+ - type: precision_at_5
818
+ value: 0.255
819
+ - type: recall_at_1
820
+ value: 0.338
821
+ - type: recall_at_10
822
+ value: 1.545
823
+ - type: recall_at_100
824
+ value: 5.38
825
+ - type: recall_at_1000
826
+ value: 14.609
827
+ - type: recall_at_3
828
+ value: 0.826
829
+ - type: recall_at_5
830
+ value: 0.975
831
+ - task:
832
+ type: Retrieval
833
+ dataset:
834
+ type: BeIR/cqadupstack
835
+ name: MTEB CQADupstackUnixRetrieval
836
+ config: default
837
+ split: test
838
+ revision: None
839
+ metrics:
840
+ - type: map_at_1
841
+ value: 0.8240000000000001
842
+ - type: map_at_10
843
+ value: 1.254
844
+ - type: map_at_100
845
+ value: 1.389
846
+ - type: map_at_1000
847
+ value: 1.419
848
+ - type: map_at_3
849
+ value: 1.158
850
+ - type: map_at_5
851
+ value: 1.189
852
+ - type: mrr_at_1
853
+ value: 0.9329999999999999
854
+ - type: mrr_at_10
855
+ value: 1.4200000000000002
856
+ - type: mrr_at_100
857
+ value: 1.59
858
+ - type: mrr_at_1000
859
+ value: 1.629
860
+ - type: mrr_at_3
861
+ value: 1.29
862
+ - type: mrr_at_5
863
+ value: 1.332
864
+ - type: ndcg_at_1
865
+ value: 0.9329999999999999
866
+ - type: ndcg_at_10
867
+ value: 1.53
868
+ - type: ndcg_at_100
869
+ value: 2.418
870
+ - type: ndcg_at_1000
871
+ value: 3.7310000000000003
872
+ - type: ndcg_at_3
873
+ value: 1.302
874
+ - type: ndcg_at_5
875
+ value: 1.363
876
+ - type: precision_at_1
877
+ value: 0.9329999999999999
878
+ - type: precision_at_10
879
+ value: 0.271
880
+ - type: precision_at_100
881
+ value: 0.083
882
+ - type: precision_at_1000
883
+ value: 0.024
884
+ - type: precision_at_3
885
+ value: 0.622
886
+ - type: precision_at_5
887
+ value: 0.41000000000000003
888
+ - type: recall_at_1
889
+ value: 0.8240000000000001
890
+ - type: recall_at_10
891
+ value: 2.1999999999999997
892
+ - type: recall_at_100
893
+ value: 6.584
894
+ - type: recall_at_1000
895
+ value: 17.068
896
+ - type: recall_at_3
897
+ value: 1.5859999999999999
898
+ - type: recall_at_5
899
+ value: 1.7260000000000002
900
+ - task:
901
+ type: Retrieval
902
+ dataset:
903
+ type: BeIR/cqadupstack
904
+ name: MTEB CQADupstackWebmastersRetrieval
905
+ config: default
906
+ split: test
907
+ revision: None
908
+ metrics:
909
+ - type: map_at_1
910
+ value: 0.404
911
+ - type: map_at_10
912
+ value: 0.788
913
+ - type: map_at_100
914
+ value: 0.9860000000000001
915
+ - type: map_at_1000
916
+ value: 1.04
917
+ - type: map_at_3
918
+ value: 0.676
919
+ - type: map_at_5
920
+ value: 0.733
921
+ - type: mrr_at_1
922
+ value: 0.5930000000000001
923
+ - type: mrr_at_10
924
+ value: 1.278
925
+ - type: mrr_at_100
926
+ value: 1.545
927
+ - type: mrr_at_1000
928
+ value: 1.599
929
+ - type: mrr_at_3
930
+ value: 1.054
931
+ - type: mrr_at_5
932
+ value: 1.192
933
+ - type: ndcg_at_1
934
+ value: 0.5930000000000001
935
+ - type: ndcg_at_10
936
+ value: 1.1280000000000001
937
+ - type: ndcg_at_100
938
+ value: 2.2689999999999997
939
+ - type: ndcg_at_1000
940
+ value: 4.274
941
+ - type: ndcg_at_3
942
+ value: 0.919
943
+ - type: ndcg_at_5
944
+ value: 1.038
945
+ - type: precision_at_1
946
+ value: 0.5930000000000001
947
+ - type: precision_at_10
948
+ value: 0.296
949
+ - type: precision_at_100
950
+ value: 0.152
951
+ - type: precision_at_1000
952
+ value: 0.05
953
+ - type: precision_at_3
954
+ value: 0.527
955
+ - type: precision_at_5
956
+ value: 0.47400000000000003
957
+ - type: recall_at_1
958
+ value: 0.404
959
+ - type: recall_at_10
960
+ value: 1.601
961
+ - type: recall_at_100
962
+ value: 6.885
963
+ - type: recall_at_1000
964
+ value: 22.356
965
+ - type: recall_at_3
966
+ value: 0.9490000000000001
967
+ - type: recall_at_5
968
+ value: 1.206
969
+ - task:
970
+ type: Retrieval
971
+ dataset:
972
+ type: BeIR/cqadupstack
973
+ name: MTEB CQADupstackWordpressRetrieval
974
+ config: default
975
+ split: test
976
+ revision: None
977
+ metrics:
978
+ - type: map_at_1
979
+ value: 0.185
980
+ - type: map_at_10
981
+ value: 0.192
982
+ - type: map_at_100
983
+ value: 0.271
984
+ - type: map_at_1000
985
+ value: 0.307
986
+ - type: map_at_3
987
+ value: 0.185
988
+ - type: map_at_5
989
+ value: 0.185
990
+ - type: mrr_at_1
991
+ value: 0.185
992
+ - type: mrr_at_10
993
+ value: 0.20500000000000002
994
+ - type: mrr_at_100
995
+ value: 0.292
996
+ - type: mrr_at_1000
997
+ value: 0.331
998
+ - type: mrr_at_3
999
+ value: 0.185
1000
+ - type: mrr_at_5
1001
+ value: 0.185
1002
+ - type: ndcg_at_1
1003
+ value: 0.185
1004
+ - type: ndcg_at_10
1005
+ value: 0.211
1006
+ - type: ndcg_at_100
1007
+ value: 0.757
1008
+ - type: ndcg_at_1000
1009
+ value: 1.928
1010
+ - type: ndcg_at_3
1011
+ value: 0.185
1012
+ - type: ndcg_at_5
1013
+ value: 0.185
1014
+ - type: precision_at_1
1015
+ value: 0.185
1016
+ - type: precision_at_10
1017
+ value: 0.037
1018
+ - type: precision_at_100
1019
+ value: 0.039
1020
+ - type: precision_at_1000
1021
+ value: 0.015
1022
+ - type: precision_at_3
1023
+ value: 0.062
1024
+ - type: precision_at_5
1025
+ value: 0.037
1026
+ - type: recall_at_1
1027
+ value: 0.185
1028
+ - type: recall_at_10
1029
+ value: 0.246
1030
+ - type: recall_at_100
1031
+ value: 3.05
1032
+ - type: recall_at_1000
1033
+ value: 12.5
1034
+ - type: recall_at_3
1035
+ value: 0.185
1036
+ - type: recall_at_5
1037
+ value: 0.185
1038
+ - task:
1039
+ type: Retrieval
1040
+ dataset:
1041
+ type: climate-fever
1042
+ name: MTEB ClimateFEVER
1043
+ config: default
1044
+ split: test
1045
+ revision: None
1046
+ metrics:
1047
+ - type: map_at_1
1048
+ value: 0.241
1049
+ - type: map_at_10
1050
+ value: 0.372
1051
+ - type: map_at_100
1052
+ value: 0.45999999999999996
1053
+ - type: map_at_1000
1054
+ value: 0.47600000000000003
1055
+ - type: map_at_3
1056
+ value: 0.33999999999999997
1057
+ - type: map_at_5
1058
+ value: 0.359
1059
+ - type: mrr_at_1
1060
+ value: 0.651
1061
+ - type: mrr_at_10
1062
+ value: 1.03
1063
+ - type: mrr_at_100
1064
+ value: 1.2489999999999999
1065
+ - type: mrr_at_1000
1066
+ value: 1.282
1067
+ - type: mrr_at_3
1068
+ value: 0.9450000000000001
1069
+ - type: mrr_at_5
1070
+ value: 1.0030000000000001
1071
+ - type: ndcg_at_1
1072
+ value: 0.651
1073
+ - type: ndcg_at_10
1074
+ value: 0.588
1075
+ - type: ndcg_at_100
1076
+ value: 1.2550000000000001
1077
+ - type: ndcg_at_1000
1078
+ value: 1.9040000000000001
1079
+ - type: ndcg_at_3
1080
+ value: 0.547
1081
+ - type: ndcg_at_5
1082
+ value: 0.549
1083
+ - type: precision_at_1
1084
+ value: 0.651
1085
+ - type: precision_at_10
1086
+ value: 0.182
1087
+ - type: precision_at_100
1088
+ value: 0.086
1089
+ - type: precision_at_1000
1090
+ value: 0.02
1091
+ - type: precision_at_3
1092
+ value: 0.434
1093
+ - type: precision_at_5
1094
+ value: 0.313
1095
+ - type: recall_at_1
1096
+ value: 0.241
1097
+ - type: recall_at_10
1098
+ value: 0.63
1099
+ - type: recall_at_100
1100
+ value: 3.1759999999999997
1101
+ - type: recall_at_1000
1102
+ value: 7.175
1103
+ - type: recall_at_3
1104
+ value: 0.46299999999999997
1105
+ - type: recall_at_5
1106
+ value: 0.543
1107
+ - task:
1108
+ type: Retrieval
1109
+ dataset:
1110
+ type: dbpedia-entity
1111
+ name: MTEB DBPedia
1112
+ config: default
1113
+ split: test
1114
+ revision: None
1115
+ metrics:
1116
+ - type: map_at_1
1117
+ value: 0.04
1118
+ - type: map_at_10
1119
+ value: 0.089
1120
+ - type: map_at_100
1121
+ value: 0.133
1122
+ - type: map_at_1000
1123
+ value: 0.165
1124
+ - type: map_at_3
1125
+ value: 0.054
1126
+ - type: map_at_5
1127
+ value: 0.056999999999999995
1128
+ - type: mrr_at_1
1129
+ value: 0.75
1130
+ - type: mrr_at_10
1131
+ value: 1.4749999999999999
1132
+ - type: mrr_at_100
1133
+ value: 1.8010000000000002
1134
+ - type: mrr_at_1000
1135
+ value: 1.847
1136
+ - type: mrr_at_3
1137
+ value: 1.208
1138
+ - type: mrr_at_5
1139
+ value: 1.333
1140
+ - type: ndcg_at_1
1141
+ value: 0.625
1142
+ - type: ndcg_at_10
1143
+ value: 0.428
1144
+ - type: ndcg_at_100
1145
+ value: 0.705
1146
+ - type: ndcg_at_1000
1147
+ value: 1.564
1148
+ - type: ndcg_at_3
1149
+ value: 0.5369999999999999
1150
+ - type: ndcg_at_5
1151
+ value: 0.468
1152
+ - type: precision_at_1
1153
+ value: 0.75
1154
+ - type: precision_at_10
1155
+ value: 0.375
1156
+ - type: precision_at_100
1157
+ value: 0.27499999999999997
1158
+ - type: precision_at_1000
1159
+ value: 0.10300000000000001
1160
+ - type: precision_at_3
1161
+ value: 0.583
1162
+ - type: precision_at_5
1163
+ value: 0.5
1164
+ - type: recall_at_1
1165
+ value: 0.04
1166
+ - type: recall_at_10
1167
+ value: 0.385
1168
+ - type: recall_at_100
1169
+ value: 1.2670000000000001
1170
+ - type: recall_at_1000
1171
+ value: 4.522
1172
+ - type: recall_at_3
1173
+ value: 0.07100000000000001
1174
+ - type: recall_at_5
1175
+ value: 0.08099999999999999
1176
+ - task:
1177
+ type: Classification
1178
+ dataset:
1179
+ type: mteb/emotion
1180
+ name: MTEB EmotionClassification
1181
+ config: default
1182
+ split: test
1183
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
1184
+ metrics:
1185
+ - type: accuracy
1186
+ value: 22.749999999999996
1187
+ - type: f1
1188
+ value: 19.335020165482693
1189
+ - task:
1190
+ type: Retrieval
1191
+ dataset:
1192
+ type: fever
1193
+ name: MTEB FEVER
1194
+ config: default
1195
+ split: test
1196
+ revision: None
1197
+ metrics:
1198
+ - type: map_at_1
1199
+ value: 0.257
1200
+ - type: map_at_10
1201
+ value: 0.416
1202
+ - type: map_at_100
1203
+ value: 0.451
1204
+ - type: map_at_1000
1205
+ value: 0.46499999999999997
1206
+ - type: map_at_3
1207
+ value: 0.37
1208
+ - type: map_at_5
1209
+ value: 0.386
1210
+ - type: mrr_at_1
1211
+ value: 0.27
1212
+ - type: mrr_at_10
1213
+ value: 0.44200000000000006
1214
+ - type: mrr_at_100
1215
+ value: 0.48
1216
+ - type: mrr_at_1000
1217
+ value: 0.49500000000000005
1218
+ - type: mrr_at_3
1219
+ value: 0.38999999999999996
1220
+ - type: mrr_at_5
1221
+ value: 0.411
1222
+ - type: ndcg_at_1
1223
+ value: 0.27
1224
+ - type: ndcg_at_10
1225
+ value: 0.51
1226
+ - type: ndcg_at_100
1227
+ value: 0.738
1228
+ - type: ndcg_at_1000
1229
+ value: 1.2630000000000001
1230
+ - type: ndcg_at_3
1231
+ value: 0.41000000000000003
1232
+ - type: ndcg_at_5
1233
+ value: 0.439
1234
+ - type: precision_at_1
1235
+ value: 0.27
1236
+ - type: precision_at_10
1237
+ value: 0.084
1238
+ - type: precision_at_100
1239
+ value: 0.021
1240
+ - type: precision_at_1000
1241
+ value: 0.006999999999999999
1242
+ - type: precision_at_3
1243
+ value: 0.17500000000000002
1244
+ - type: precision_at_5
1245
+ value: 0.123
1246
+ - type: recall_at_1
1247
+ value: 0.257
1248
+ - type: recall_at_10
1249
+ value: 0.786
1250
+ - type: recall_at_100
1251
+ value: 1.959
1252
+ - type: recall_at_1000
1253
+ value: 6.334
1254
+ - type: recall_at_3
1255
+ value: 0.49699999999999994
1256
+ - type: recall_at_5
1257
+ value: 0.5680000000000001
1258
+ - task:
1259
+ type: Retrieval
1260
+ dataset:
1261
+ type: fiqa
1262
+ name: MTEB FiQA2018
1263
+ config: default
1264
+ split: test
1265
+ revision: None
1266
+ metrics:
1267
+ - type: map_at_1
1268
+ value: 0.28900000000000003
1269
+ - type: map_at_10
1270
+ value: 0.475
1271
+ - type: map_at_100
1272
+ value: 0.559
1273
+ - type: map_at_1000
1274
+ value: 0.5930000000000001
1275
+ - type: map_at_3
1276
+ value: 0.38999999999999996
1277
+ - type: map_at_5
1278
+ value: 0.41700000000000004
1279
+ - type: mrr_at_1
1280
+ value: 0.772
1281
+ - type: mrr_at_10
1282
+ value: 1.107
1283
+ - type: mrr_at_100
1284
+ value: 1.269
1285
+ - type: mrr_at_1000
1286
+ value: 1.323
1287
+ - type: mrr_at_3
1288
+ value: 0.9520000000000001
1289
+ - type: mrr_at_5
1290
+ value: 1.0290000000000001
1291
+ - type: ndcg_at_1
1292
+ value: 0.772
1293
+ - type: ndcg_at_10
1294
+ value: 0.755
1295
+ - type: ndcg_at_100
1296
+ value: 1.256
1297
+ - type: ndcg_at_1000
1298
+ value: 2.55
1299
+ - type: ndcg_at_3
1300
+ value: 0.633
1301
+ - type: ndcg_at_5
1302
+ value: 0.639
1303
+ - type: precision_at_1
1304
+ value: 0.772
1305
+ - type: precision_at_10
1306
+ value: 0.262
1307
+ - type: precision_at_100
1308
+ value: 0.082
1309
+ - type: precision_at_1000
1310
+ value: 0.03
1311
+ - type: precision_at_3
1312
+ value: 0.46299999999999997
1313
+ - type: precision_at_5
1314
+ value: 0.33999999999999997
1315
+ - type: recall_at_1
1316
+ value: 0.28900000000000003
1317
+ - type: recall_at_10
1318
+ value: 0.976
1319
+ - type: recall_at_100
1320
+ value: 2.802
1321
+ - type: recall_at_1000
1322
+ value: 11.466
1323
+ - type: recall_at_3
1324
+ value: 0.54
1325
+ - type: recall_at_5
1326
+ value: 0.6479999999999999
1327
+ - task:
1328
+ type: Retrieval
1329
+ dataset:
1330
+ type: hotpotqa
1331
+ name: MTEB HotpotQA
1332
+ config: default
1333
+ split: test
1334
+ revision: None
1335
+ metrics:
1336
+ - type: map_at_1
1337
+ value: 0.257
1338
+ - type: map_at_10
1339
+ value: 0.395
1340
+ - type: map_at_100
1341
+ value: 0.436
1342
+ - type: map_at_1000
1343
+ value: 0.447
1344
+ - type: map_at_3
1345
+ value: 0.347
1346
+ - type: map_at_5
1347
+ value: 0.369
1348
+ - type: mrr_at_1
1349
+ value: 0.513
1350
+ - type: mrr_at_10
1351
+ value: 0.787
1352
+ - type: mrr_at_100
1353
+ value: 0.865
1354
+ - type: mrr_at_1000
1355
+ value: 0.8840000000000001
1356
+ - type: mrr_at_3
1357
+ value: 0.6930000000000001
1358
+ - type: mrr_at_5
1359
+ value: 0.738
1360
+ - type: ndcg_at_1
1361
+ value: 0.513
1362
+ - type: ndcg_at_10
1363
+ value: 0.587
1364
+ - type: ndcg_at_100
1365
+ value: 0.881
1366
+ - type: ndcg_at_1000
1367
+ value: 1.336
1368
+ - type: ndcg_at_3
1369
+ value: 0.46299999999999997
1370
+ - type: ndcg_at_5
1371
+ value: 0.511
1372
+ - type: precision_at_1
1373
+ value: 0.513
1374
+ - type: precision_at_10
1375
+ value: 0.151
1376
+ - type: precision_at_100
1377
+ value: 0.04
1378
+ - type: precision_at_1000
1379
+ value: 0.01
1380
+ - type: precision_at_3
1381
+ value: 0.311
1382
+ - type: precision_at_5
1383
+ value: 0.22399999999999998
1384
+ - type: recall_at_1
1385
+ value: 0.257
1386
+ - type: recall_at_10
1387
+ value: 0.756
1388
+ - type: recall_at_100
1389
+ value: 1.9849999999999999
1390
+ - type: recall_at_1000
1391
+ value: 5.111000000000001
1392
+ - type: recall_at_3
1393
+ value: 0.466
1394
+ - type: recall_at_5
1395
+ value: 0.5599999999999999
1396
+ - task:
1397
+ type: Classification
1398
+ dataset:
1399
+ type: mteb/imdb
1400
+ name: MTEB ImdbClassification
1401
+ config: default
1402
+ split: test
1403
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1404
+ metrics:
1405
+ - type: accuracy
1406
+ value: 50.76400000000001
1407
+ - type: ap
1408
+ value: 50.41569411130455
1409
+ - type: f1
1410
+ value: 50.14266303576945
1411
+ - task:
1412
+ type: Retrieval
1413
+ dataset:
1414
+ type: msmarco
1415
+ name: MTEB MSMARCO
1416
+ config: default
1417
+ split: dev
1418
+ revision: None
1419
+ metrics:
1420
+ - type: map_at_1
1421
+ value: 0.14300000000000002
1422
+ - type: map_at_10
1423
+ value: 0.23700000000000002
1424
+ - type: map_at_100
1425
+ value: 0.27799999999999997
1426
+ - type: map_at_1000
1427
+ value: 0.291
1428
+ - type: map_at_3
1429
+ value: 0.197
1430
+ - type: map_at_5
1431
+ value: 0.215
1432
+ - type: mrr_at_1
1433
+ value: 0.14300000000000002
1434
+ - type: mrr_at_10
1435
+ value: 0.247
1436
+ - type: mrr_at_100
1437
+ value: 0.29
1438
+ - type: mrr_at_1000
1439
+ value: 0.303
1440
+ - type: mrr_at_3
1441
+ value: 0.201
1442
+ - type: mrr_at_5
1443
+ value: 0.219
1444
+ - type: ndcg_at_1
1445
+ value: 0.14300000000000002
1446
+ - type: ndcg_at_10
1447
+ value: 0.307
1448
+ - type: ndcg_at_100
1449
+ value: 0.5720000000000001
1450
+ - type: ndcg_at_1000
1451
+ value: 1.053
1452
+ - type: ndcg_at_3
1453
+ value: 0.215
1454
+ - type: ndcg_at_5
1455
+ value: 0.248
1456
+ - type: precision_at_1
1457
+ value: 0.14300000000000002
1458
+ - type: precision_at_10
1459
+ value: 0.056999999999999995
1460
+ - type: precision_at_100
1461
+ value: 0.02
1462
+ - type: precision_at_1000
1463
+ value: 0.006
1464
+ - type: precision_at_3
1465
+ value: 0.091
1466
+ - type: precision_at_5
1467
+ value: 0.07200000000000001
1468
+ - type: recall_at_1
1469
+ value: 0.14300000000000002
1470
+ - type: recall_at_10
1471
+ value: 0.522
1472
+ - type: recall_at_100
1473
+ value: 1.9009999999999998
1474
+ - type: recall_at_1000
1475
+ value: 5.893000000000001
1476
+ - type: recall_at_3
1477
+ value: 0.263
1478
+ - type: recall_at_5
1479
+ value: 0.34099999999999997
1480
+ - task:
1481
+ type: Classification
1482
+ dataset:
1483
+ type: mteb/mtop_domain
1484
+ name: MTEB MTOPDomainClassification (en)
1485
+ config: en
1486
+ split: test
1487
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1488
+ metrics:
1489
+ - type: accuracy
1490
+ value: 61.03283173734611
1491
+ - type: f1
1492
+ value: 61.24012492746259
1493
+ - task:
1494
+ type: Classification
1495
+ dataset:
1496
+ type: mteb/mtop_intent
1497
+ name: MTEB MTOPIntentClassification (en)
1498
+ config: en
1499
+ split: test
1500
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1501
+ metrics:
1502
+ - type: accuracy
1503
+ value: 29.68308253533972
1504
+ - type: f1
1505
+ value: 16.243459114946905
1506
+ - task:
1507
+ type: Classification
1508
+ dataset:
1509
+ type: mteb/amazon_massive_intent
1510
+ name: MTEB MassiveIntentClassification (en)
1511
+ config: en
1512
+ split: test
1513
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1514
+ metrics:
1515
+ - type: accuracy
1516
+ value: 34.330867518493605
1517
+ - type: f1
1518
+ value: 33.176158044175935
1519
+ - task:
1520
+ type: Classification
1521
+ dataset:
1522
+ type: mteb/amazon_massive_scenario
1523
+ name: MTEB MassiveScenarioClassification (en)
1524
+ config: en
1525
+ split: test
1526
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1527
+ metrics:
1528
+ - type: accuracy
1529
+ value: 44.13248150638871
1530
+ - type: f1
1531
+ value: 44.24904249078732
1532
+ - task:
1533
+ type: Clustering
1534
+ dataset:
1535
+ type: mteb/medrxiv-clustering-p2p
1536
+ name: MTEB MedrxivClusteringP2P
1537
+ config: default
1538
+ split: test
1539
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1540
+ metrics:
1541
+ - type: v_measure
1542
+ value: 15.698400177259078
1543
+ - task:
1544
+ type: Clustering
1545
+ dataset:
1546
+ type: mteb/medrxiv-clustering-s2s
1547
+ name: MTEB MedrxivClusteringS2S
1548
+ config: default
1549
+ split: test
1550
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1551
+ metrics:
1552
+ - type: v_measure
1553
+ value: 14.888797785310235
1554
+ - task:
1555
+ type: Reranking
1556
+ dataset:
1557
+ type: mteb/mind_small
1558
+ name: MTEB MindSmallReranking
1559
+ config: default
1560
+ split: test
1561
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1562
+ metrics:
1563
+ - type: map
1564
+ value: 25.652445385382126
1565
+ - type: mrr
1566
+ value: 25.891573325600227
1567
+ - task:
1568
+ type: Retrieval
1569
+ dataset:
1570
+ type: nfcorpus
1571
+ name: MTEB NFCorpus
1572
+ config: default
1573
+ split: test
1574
+ revision: None
1575
+ metrics:
1576
+ - type: map_at_1
1577
+ value: 0.322
1578
+ - type: map_at_10
1579
+ value: 0.7230000000000001
1580
+ - type: map_at_100
1581
+ value: 1.248
1582
+ - type: map_at_1000
1583
+ value: 1.873
1584
+ - type: map_at_3
1585
+ value: 0.479
1586
+ - type: map_at_5
1587
+ value: 0.5700000000000001
1588
+ - type: mrr_at_1
1589
+ value: 6.502
1590
+ - type: mrr_at_10
1591
+ value: 10.735
1592
+ - type: mrr_at_100
1593
+ value: 11.848
1594
+ - type: mrr_at_1000
1595
+ value: 11.995000000000001
1596
+ - type: mrr_at_3
1597
+ value: 9.391
1598
+ - type: mrr_at_5
1599
+ value: 9.732000000000001
1600
+ - type: ndcg_at_1
1601
+ value: 6.037
1602
+ - type: ndcg_at_10
1603
+ value: 4.873
1604
+ - type: ndcg_at_100
1605
+ value: 5.959
1606
+ - type: ndcg_at_1000
1607
+ value: 14.424000000000001
1608
+ - type: ndcg_at_3
1609
+ value: 5.4559999999999995
1610
+ - type: ndcg_at_5
1611
+ value: 5.074
1612
+ - type: precision_at_1
1613
+ value: 6.192
1614
+ - type: precision_at_10
1615
+ value: 4.458
1616
+ - type: precision_at_100
1617
+ value: 2.5700000000000003
1618
+ - type: precision_at_1000
1619
+ value: 1.3679999999999999
1620
+ - type: precision_at_3
1621
+ value: 5.676
1622
+ - type: precision_at_5
1623
+ value: 4.954
1624
+ - type: recall_at_1
1625
+ value: 0.322
1626
+ - type: recall_at_10
1627
+ value: 1.545
1628
+ - type: recall_at_100
1629
+ value: 8.301
1630
+ - type: recall_at_1000
1631
+ value: 37.294
1632
+ - type: recall_at_3
1633
+ value: 0.623
1634
+ - type: recall_at_5
1635
+ value: 0.865
1636
+ - task:
1637
+ type: Retrieval
1638
+ dataset:
1639
+ type: nq
1640
+ name: MTEB NQ
1641
+ config: default
1642
+ split: test
1643
+ revision: None
1644
+ metrics:
1645
+ - type: map_at_1
1646
+ value: 0.188
1647
+ - type: map_at_10
1648
+ value: 0.27
1649
+ - type: map_at_100
1650
+ value: 0.322
1651
+ - type: map_at_1000
1652
+ value: 0.335
1653
+ - type: map_at_3
1654
+ value: 0.246
1655
+ - type: map_at_5
1656
+ value: 0.246
1657
+ - type: mrr_at_1
1658
+ value: 0.203
1659
+ - type: mrr_at_10
1660
+ value: 0.28300000000000003
1661
+ - type: mrr_at_100
1662
+ value: 0.344
1663
+ - type: mrr_at_1000
1664
+ value: 0.357
1665
+ - type: mrr_at_3
1666
+ value: 0.261
1667
+ - type: mrr_at_5
1668
+ value: 0.261
1669
+ - type: ndcg_at_1
1670
+ value: 0.203
1671
+ - type: ndcg_at_10
1672
+ value: 0.329
1673
+ - type: ndcg_at_100
1674
+ value: 0.628
1675
+ - type: ndcg_at_1000
1676
+ value: 1.0959999999999999
1677
+ - type: ndcg_at_3
1678
+ value: 0.272
1679
+ - type: ndcg_at_5
1680
+ value: 0.272
1681
+ - type: precision_at_1
1682
+ value: 0.203
1683
+ - type: precision_at_10
1684
+ value: 0.055
1685
+ - type: precision_at_100
1686
+ value: 0.024
1687
+ - type: precision_at_1000
1688
+ value: 0.006999999999999999
1689
+ - type: precision_at_3
1690
+ value: 0.116
1691
+ - type: precision_at_5
1692
+ value: 0.06999999999999999
1693
+ - type: recall_at_1
1694
+ value: 0.188
1695
+ - type: recall_at_10
1696
+ value: 0.507
1697
+ - type: recall_at_100
1698
+ value: 1.883
1699
+ - type: recall_at_1000
1700
+ value: 5.609999999999999
1701
+ - type: recall_at_3
1702
+ value: 0.333
1703
+ - type: recall_at_5
1704
+ value: 0.333
1705
+ - task:
1706
+ type: Retrieval
1707
+ dataset:
1708
+ type: quora
1709
+ name: MTEB QuoraRetrieval
1710
+ config: default
1711
+ split: test
1712
+ revision: None
1713
+ metrics:
1714
+ - type: map_at_1
1715
+ value: 24.016000000000002
1716
+ - type: map_at_10
1717
+ value: 28.977999999999998
1718
+ - type: map_at_100
1719
+ value: 29.579
1720
+ - type: map_at_1000
1721
+ value: 29.648999999999997
1722
+ - type: map_at_3
1723
+ value: 27.673
1724
+ - type: map_at_5
1725
+ value: 28.427000000000003
1726
+ - type: mrr_at_1
1727
+ value: 27.93
1728
+ - type: mrr_at_10
1729
+ value: 32.462999999999994
1730
+ - type: mrr_at_100
1731
+ value: 32.993
1732
+ - type: mrr_at_1000
1733
+ value: 33.044000000000004
1734
+ - type: mrr_at_3
1735
+ value: 31.252000000000002
1736
+ - type: mrr_at_5
1737
+ value: 31.968999999999998
1738
+ - type: ndcg_at_1
1739
+ value: 27.96
1740
+ - type: ndcg_at_10
1741
+ value: 31.954
1742
+ - type: ndcg_at_100
1743
+ value: 34.882000000000005
1744
+ - type: ndcg_at_1000
1745
+ value: 36.751
1746
+ - type: ndcg_at_3
1747
+ value: 29.767
1748
+ - type: ndcg_at_5
1749
+ value: 30.816
1750
+ - type: precision_at_1
1751
+ value: 27.96
1752
+ - type: precision_at_10
1753
+ value: 4.826
1754
+ - type: precision_at_100
1755
+ value: 0.697
1756
+ - type: precision_at_1000
1757
+ value: 0.093
1758
+ - type: precision_at_3
1759
+ value: 12.837000000000002
1760
+ - type: precision_at_5
1761
+ value: 8.559999999999999
1762
+ - type: recall_at_1
1763
+ value: 24.016000000000002
1764
+ - type: recall_at_10
1765
+ value: 37.574999999999996
1766
+ - type: recall_at_100
1767
+ value: 50.843
1768
+ - type: recall_at_1000
1769
+ value: 64.654
1770
+ - type: recall_at_3
1771
+ value: 31.182
1772
+ - type: recall_at_5
1773
+ value: 34.055
1774
+ - task:
1775
+ type: Clustering
1776
+ dataset:
1777
+ type: mteb/reddit-clustering
1778
+ name: MTEB RedditClustering
1779
+ config: default
1780
+ split: test
1781
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
1782
+ metrics:
1783
+ - type: v_measure
1784
+ value: 18.38048892083281
1785
+ - task:
1786
+ type: Clustering
1787
+ dataset:
1788
+ type: mteb/reddit-clustering-p2p
1789
+ name: MTEB RedditClusteringP2P
1790
+ config: default
1791
+ split: test
1792
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
1793
+ metrics:
1794
+ - type: v_measure
1795
+ value: 27.103011764141478
1796
+ - task:
1797
+ type: Retrieval
1798
+ dataset:
1799
+ type: scidocs
1800
+ name: MTEB SCIDOCS
1801
+ config: default
1802
+ split: test
1803
+ revision: None
1804
+ metrics:
1805
+ - type: map_at_1
1806
+ value: 0.18
1807
+ - type: map_at_10
1808
+ value: 0.457
1809
+ - type: map_at_100
1810
+ value: 0.634
1811
+ - type: map_at_1000
1812
+ value: 0.7000000000000001
1813
+ - type: map_at_3
1814
+ value: 0.333
1815
+ - type: map_at_5
1816
+ value: 0.387
1817
+ - type: mrr_at_1
1818
+ value: 0.8999999999999999
1819
+ - type: mrr_at_10
1820
+ value: 1.967
1821
+ - type: mrr_at_100
1822
+ value: 2.396
1823
+ - type: mrr_at_1000
1824
+ value: 2.495
1825
+ - type: mrr_at_3
1826
+ value: 1.567
1827
+ - type: mrr_at_5
1828
+ value: 1.7670000000000001
1829
+ - type: ndcg_at_1
1830
+ value: 0.8999999999999999
1831
+ - type: ndcg_at_10
1832
+ value: 1.022
1833
+ - type: ndcg_at_100
1834
+ value: 2.366
1835
+ - type: ndcg_at_1000
1836
+ value: 4.689
1837
+ - type: ndcg_at_3
1838
+ value: 0.882
1839
+ - type: ndcg_at_5
1840
+ value: 0.7929999999999999
1841
+ - type: precision_at_1
1842
+ value: 0.8999999999999999
1843
+ - type: precision_at_10
1844
+ value: 0.58
1845
+ - type: precision_at_100
1846
+ value: 0.263
1847
+ - type: precision_at_1000
1848
+ value: 0.084
1849
+ - type: precision_at_3
1850
+ value: 0.8999999999999999
1851
+ - type: precision_at_5
1852
+ value: 0.74
1853
+ - type: recall_at_1
1854
+ value: 0.18
1855
+ - type: recall_at_10
1856
+ value: 1.208
1857
+ - type: recall_at_100
1858
+ value: 5.373
1859
+ - type: recall_at_1000
1860
+ value: 17.112
1861
+ - type: recall_at_3
1862
+ value: 0.5579999999999999
1863
+ - type: recall_at_5
1864
+ value: 0.7779999999999999
1865
+ - task:
1866
+ type: STS
1867
+ dataset:
1868
+ type: mteb/sickr-sts
1869
+ name: MTEB SICK-R
1870
+ config: default
1871
+ split: test
1872
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1873
+ metrics:
1874
+ - type: cos_sim_pearson
1875
+ value: 55.229896309578905
1876
+ - type: cos_sim_spearman
1877
+ value: 48.54616726085393
1878
+ - type: euclidean_pearson
1879
+ value: 53.828130644322
1880
+ - type: euclidean_spearman
1881
+ value: 48.2907441223958
1882
+ - type: manhattan_pearson
1883
+ value: 53.72684612327582
1884
+ - type: manhattan_spearman
1885
+ value: 48.228319721712744
1886
+ - task:
1887
+ type: STS
1888
+ dataset:
1889
+ type: mteb/sts12-sts
1890
+ name: MTEB STS12
1891
+ config: default
1892
+ split: test
1893
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1894
+ metrics:
1895
+ - type: cos_sim_pearson
1896
+ value: 57.73555535277214
1897
+ - type: cos_sim_spearman
1898
+ value: 55.58790083939622
1899
+ - type: euclidean_pearson
1900
+ value: 61.009463373795384
1901
+ - type: euclidean_spearman
1902
+ value: 56.696846101196044
1903
+ - type: manhattan_pearson
1904
+ value: 60.875111392597894
1905
+ - type: manhattan_spearman
1906
+ value: 56.63100766160946
1907
+ - task:
1908
+ type: STS
1909
+ dataset:
1910
+ type: mteb/sts13-sts
1911
+ name: MTEB STS13
1912
+ config: default
1913
+ split: test
1914
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1915
+ metrics:
1916
+ - type: cos_sim_pearson
1917
+ value: 19.47269635955134
1918
+ - type: cos_sim_spearman
1919
+ value: 18.35951746300603
1920
+ - type: euclidean_pearson
1921
+ value: 23.130707248318714
1922
+ - type: euclidean_spearman
1923
+ value: 22.92241668287248
1924
+ - type: manhattan_pearson
1925
+ value: 22.99371642148021
+ - type: manhattan_spearman
+ value: 22.770233678121897
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts14-sts
+ name: MTEB STS14
+ config: default
+ split: test
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
+ metrics:
+ - type: cos_sim_pearson
+ value: 31.78346805351368
+ - type: cos_sim_spearman
+ value: 28.84281669682782
+ - type: euclidean_pearson
+ value: 34.508176962091156
+ - type: euclidean_spearman
+ value: 32.269242265609975
+ - type: manhattan_pearson
+ value: 34.41366600914297
+ - type: manhattan_spearman
+ value: 32.15352239729175
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts15-sts
+ name: MTEB STS15
+ config: default
+ split: test
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
+ metrics:
+ - type: cos_sim_pearson
+ value: 29.550332218260465
+ - type: cos_sim_spearman
+ value: 29.188654452524528
+ - type: euclidean_pearson
+ value: 33.80339596511417
+ - type: euclidean_spearman
+ value: 33.49607278843874
+ - type: manhattan_pearson
+ value: 33.589427741967334
+ - type: manhattan_spearman
+ value: 33.288312003652884
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts16-sts
+ name: MTEB STS16
+ config: default
+ split: test
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
+ metrics:
+ - type: cos_sim_pearson
+ value: 27.163752699585885
+ - type: cos_sim_spearman
+ value: 39.0544187582685
+ - type: euclidean_pearson
+ value: 38.93841642732113
+ - type: euclidean_spearman
+ value: 42.861814968921294
+ - type: manhattan_pearson
+ value: 38.78821319739337
+ - type: manhattan_spearman
+ value: 42.757121435678954
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (en-en)
+ config: en-en
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 57.15429605615292
+ - type: cos_sim_spearman
+ value: 61.21576579300284
+ - type: euclidean_pearson
+ value: 59.2835939062064
+ - type: euclidean_spearman
+ value: 60.902713241808236
+ - type: manhattan_pearson
+ value: 59.510770285546364
+ - type: manhattan_spearman
+ value: 61.02979474159327
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (en)
+ config: en
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 41.81726547830133
+ - type: cos_sim_spearman
+ value: 44.45123398124273
+ - type: euclidean_pearson
+ value: 46.44144033159064
+ - type: euclidean_spearman
+ value: 46.61348337508052
+ - type: manhattan_pearson
+ value: 46.48092744041165
+ - type: manhattan_spearman
+ value: 46.78049599791891
+ - task:
+ type: STS
+ dataset:
+ type: mteb/stsbenchmark-sts
+ name: MTEB STSBenchmark
+ config: default
+ split: test
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
+ metrics:
+ - type: cos_sim_pearson
+ value: 46.085942179295465
+ - type: cos_sim_spearman
+ value: 44.394736992467365
+ - type: euclidean_pearson
+ value: 47.06981069147408
+ - type: euclidean_spearman
+ value: 45.40499474054004
+ - type: manhattan_pearson
+ value: 46.96497631950794
+ - type: manhattan_spearman
+ value: 45.31936619298336
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/scidocs-reranking
+ name: MTEB SciDocsRR
+ config: default
+ split: test
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
+ metrics:
+ - type: map
+ value: 43.89526517578129
+ - type: mrr
+ value: 64.30753070458954
+ - task:
+ type: Retrieval
+ dataset:
+ type: scifact
+ name: MTEB SciFact
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 1.417
+ - type: map_at_10
+ value: 2.189
+ - type: map_at_100
+ value: 2.5669999999999997
+ - type: map_at_1000
+ value: 2.662
+ - type: map_at_3
+ value: 1.694
+ - type: map_at_5
+ value: 1.928
+ - type: mrr_at_1
+ value: 1.667
+ - type: mrr_at_10
+ value: 2.4899999999999998
+ - type: mrr_at_100
+ value: 2.8400000000000003
+ - type: mrr_at_1000
+ value: 2.928
+ - type: mrr_at_3
+ value: 1.944
+ - type: mrr_at_5
+ value: 2.178
+ - type: ndcg_at_1
+ value: 1.667
+ - type: ndcg_at_10
+ value: 2.913
+ - type: ndcg_at_100
+ value: 5.482
+ - type: ndcg_at_1000
+ value: 8.731
+ - type: ndcg_at_3
+ value: 1.867
+ - type: ndcg_at_5
+ value: 2.257
+ - type: precision_at_1
+ value: 1.667
+ - type: precision_at_10
+ value: 0.567
+ - type: precision_at_100
+ value: 0.213
+ - type: precision_at_1000
+ value: 0.053
+ - type: precision_at_3
+ value: 0.7779999999999999
+ - type: precision_at_5
+ value: 0.6669999999999999
+ - type: recall_at_1
+ value: 1.417
+ - type: recall_at_10
+ value: 5.028
+ - type: recall_at_100
+ value: 18.5
+ - type: recall_at_1000
+ value: 45.072
+ - type: recall_at_3
+ value: 2.083
+ - type: recall_at_5
+ value: 3.083
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/sprintduplicatequestions-pairclassification
+ name: MTEB SprintDuplicateQuestions
+ config: default
+ split: test
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
+ metrics:
+ - type: cos_sim_accuracy
+ value: 99.02871287128713
+ - type: cos_sim_ap
+ value: 17.404404071912694
+ - type: cos_sim_f1
+ value: 25.89285714285714
+ - type: cos_sim_precision
+ value: 29.292929292929294
+ - type: cos_sim_recall
+ value: 23.200000000000003
+ - type: dot_accuracy
+ value: 99.0118811881188
+ - type: dot_ap
+ value: 5.4739000785007335
+ - type: dot_f1
+ value: 12.178702570379436
+ - type: dot_precision
+ value: 8.774250440917108
+ - type: dot_recall
+ value: 19.900000000000002
+ - type: euclidean_accuracy
+ value: 99.03663366336633
+ - type: euclidean_ap
+ value: 19.20851069839796
+ - type: euclidean_f1
+ value: 27.16555612506407
+ - type: euclidean_precision
+ value: 27.865404837013667
+ - type: euclidean_recall
+ value: 26.5
+ - type: manhattan_accuracy
+ value: 99.03663366336633
+ - type: manhattan_ap
+ value: 19.12862913626528
+ - type: manhattan_f1
+ value: 26.96629213483146
+ - type: manhattan_precision
+ value: 28.99884925201381
+ - type: manhattan_recall
+ value: 25.2
+ - type: max_accuracy
+ value: 99.03663366336633
+ - type: max_ap
+ value: 19.20851069839796
+ - type: max_f1
+ value: 27.16555612506407
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering
+ name: MTEB StackExchangeClustering
+ config: default
+ split: test
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
+ metrics:
+ - type: v_measure
+ value: 23.657118721775905
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering-p2p
+ name: MTEB StackExchangeClusteringP2P
+ config: default
+ split: test
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
+ metrics:
+ - type: v_measure
+ value: 27.343558395037043
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/stackoverflowdupquestions-reranking
+ name: MTEB StackOverflowDupQuestions
+ config: default
+ split: test
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
+ metrics:
+ - type: map
+ value: 23.346327148080043
+ - type: mrr
+ value: 21.99097063067651
+ - task:
+ type: Retrieval
+ dataset:
+ type: trec-covid
+ name: MTEB TRECCOVID
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 0.032
+ - type: map_at_10
+ value: 0.157
+ - type: map_at_100
+ value: 0.583
+ - type: map_at_1000
+ value: 1.48
+ - type: map_at_3
+ value: 0.066
+ - type: map_at_5
+ value: 0.105
+ - type: mrr_at_1
+ value: 10.0
+ - type: mrr_at_10
+ value: 16.99
+ - type: mrr_at_100
+ value: 18.284
+ - type: mrr_at_1000
+ value: 18.394
+ - type: mrr_at_3
+ value: 14.000000000000002
+ - type: mrr_at_5
+ value: 15.8
+ - type: ndcg_at_1
+ value: 8.0
+ - type: ndcg_at_10
+ value: 7.504
+ - type: ndcg_at_100
+ value: 5.339
+ - type: ndcg_at_1000
+ value: 6.046
+ - type: ndcg_at_3
+ value: 8.358
+ - type: ndcg_at_5
+ value: 8.142000000000001
+ - type: precision_at_1
+ value: 10.0
+ - type: precision_at_10
+ value: 8.6
+ - type: precision_at_100
+ value: 5.9799999999999995
+ - type: precision_at_1000
+ value: 2.976
+ - type: precision_at_3
+ value: 9.333
+ - type: precision_at_5
+ value: 9.2
+ - type: recall_at_1
+ value: 0.032
+ - type: recall_at_10
+ value: 0.252
+ - type: recall_at_100
+ value: 1.529
+ - type: recall_at_1000
+ value: 6.364
+ - type: recall_at_3
+ value: 0.08499999999999999
+ - type: recall_at_5
+ value: 0.154
+ - task:
+ type: Retrieval
+ dataset:
+ type: webis-touche2020
+ name: MTEB Touche2020
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 0.44200000000000006
+ - type: map_at_10
+ value: 0.996
+ - type: map_at_100
+ value: 1.317
+ - type: map_at_1000
+ value: 1.624
+ - type: map_at_3
+ value: 0.736
+ - type: map_at_5
+ value: 0.951
+ - type: mrr_at_1
+ value: 4.082
+ - type: mrr_at_10
+ value: 10.102
+ - type: mrr_at_100
+ value: 10.978
+ - type: mrr_at_1000
+ value: 11.1
+ - type: mrr_at_3
+ value: 7.8229999999999995
+ - type: mrr_at_5
+ value: 9.252
+ - type: ndcg_at_1
+ value: 4.082
+ - type: ndcg_at_10
+ value: 3.821
+ - type: ndcg_at_100
+ value: 5.682
+ - type: ndcg_at_1000
+ value: 10.96
+ - type: ndcg_at_3
+ value: 4.813
+ - type: ndcg_at_5
+ value: 4.757
+ - type: precision_at_1
+ value: 4.082
+ - type: precision_at_10
+ value: 3.061
+ - type: precision_at_100
+ value: 1.367
+ - type: precision_at_1000
+ value: 0.46299999999999997
+ - type: precision_at_3
+ value: 4.7620000000000005
+ - type: precision_at_5
+ value: 4.898000000000001
+ - type: recall_at_1
+ value: 0.44200000000000006
+ - type: recall_at_10
+ value: 2.059
+ - type: recall_at_100
+ value: 7.439
+ - type: recall_at_1000
+ value: 25.191000000000003
+ - type: recall_at_3
+ value: 1.095
+ - type: recall_at_5
+ value: 1.725
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/toxic_conversations_50k
+ name: MTEB ToxicConversationsClassification
+ config: default
+ split: test
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
+ metrics:
+ - type: accuracy
+ value: 54.925999999999995
+ - type: ap
+ value: 9.658236434063275
+ - type: f1
+ value: 43.469829154993064
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/tweet_sentiment_extraction
+ name: MTEB TweetSentimentExtractionClassification
+ config: default
+ split: test
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
+ metrics:
+ - type: accuracy
+ value: 40.7498585172609
+ - type: f1
+ value: 40.720120106546574
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/twentynewsgroups-clustering
+ name: MTEB TwentyNewsgroupsClustering
+ config: default
+ split: test
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
+ metrics:
+ - type: v_measure
+ value: 20.165152514024733
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twittersemeval2015-pairclassification
+ name: MTEB TwitterSemEval2015
+ config: default
+ split: test
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
+ metrics:
+ - type: cos_sim_accuracy
+ value: 77.59432556476128
+ - type: cos_sim_ap
+ value: 30.37846072188074
+ - type: cos_sim_f1
+ value: 37.9231242656521
+ - type: cos_sim_precision
+ value: 24.064474898814172
+ - type: cos_sim_recall
+ value: 89.41952506596306
+ - type: dot_accuracy
+ value: 77.42146986946415
+ - type: dot_ap
+ value: 24.073476661930034
+ - type: dot_f1
+ value: 37.710580857735025
+ - type: dot_precision
+ value: 23.61083383243495
+ - type: dot_recall
+ value: 93.61477572559367
+ - type: euclidean_accuracy
+ value: 77.64797043571556
+ - type: euclidean_ap
+ value: 31.892152386237594
+ - type: euclidean_f1
+ value: 38.21154759481647
+ - type: euclidean_precision
+ value: 25.719243766554023
+ - type: euclidean_recall
+ value: 74.30079155672823
+ - type: manhattan_accuracy
+ value: 77.6539309769327
+ - type: manhattan_ap
+ value: 31.89545356309865
+ - type: manhattan_f1
+ value: 38.16428166172855
+ - type: manhattan_precision
+ value: 25.07247577238466
+ - type: manhattan_recall
+ value: 79.86807387862797
+ - type: max_accuracy
+ value: 77.6539309769327
+ - type: max_ap
+ value: 31.89545356309865
+ - type: max_f1
+ value: 38.21154759481647
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twitterurlcorpus-pairclassification
+ name: MTEB TwitterURLCorpus
+ config: default
+ split: test
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
+ metrics:
+ - type: cos_sim_accuracy
+ value: 76.56886715566422
+ - type: cos_sim_ap
+ value: 44.04480929059786
+ - type: cos_sim_f1
+ value: 43.73100054674686
+ - type: cos_sim_precision
+ value: 30.540367168647098
+ - type: cos_sim_recall
+ value: 76.97874961502926
+ - type: dot_accuracy
+ value: 74.80110218496526
+ - type: dot_ap
+ value: 26.487746384962758
+ - type: dot_f1
+ value: 40.91940608182585
+ - type: dot_precision
+ value: 25.9157358738502
+ - type: dot_recall
+ value: 97.18201416692331
+ - type: euclidean_accuracy
+ value: 76.97054371870998
+ - type: euclidean_ap
+ value: 47.079120397438416
+ - type: euclidean_f1
+ value: 45.866182572614115
+ - type: euclidean_precision
+ value: 34.580791490692945
+ - type: euclidean_recall
+ value: 68.0859254696643
+ - type: manhattan_accuracy
+ value: 76.96084138626927
+ - type: manhattan_ap
+ value: 47.168701873575976
+ - type: manhattan_f1
+ value: 45.985439966237614
+ - type: manhattan_precision
+ value: 34.974321938693635
+ - type: manhattan_recall
+ value: 67.11579919926086
+ - type: max_accuracy
+ value: 76.97054371870998
+ - type: max_ap
+ value: 47.168701873575976
+ - type: max_f1
+ value: 45.985439966237614
  ---
 
  # {MODEL_NAME}
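
The model-index metadata above lists MTEB benchmark scores for this model. As a rough sketch of how a single score (for example, the STSBenchmark entry) could be recomputed, assuming the `mteb` and `sentence-transformers` packages are installed and using `{MODEL_NAME}` as a placeholder for this repository's model id:

```python
# Minimal sketch: re-run one MTEB task for this model.
# "{MODEL_NAME}" is a placeholder, as in the rest of this card.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("{MODEL_NAME}")

# Evaluate on a single STS task; per-task scores are also written as JSON
# into the chosen output folder.
evaluation = MTEB(tasks=["STSBenchmark"])
results = evaluation.run(model, output_folder="results")
print(results)
```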