lmilliken committed
Commit 3a0f3b0
1 Parent(s): f6d2904

Update README.md

Files changed (1)
  1. README.md +1699 -0
README.md CHANGED
@@ -8,6 +8,1705 @@ datasets:
8
  - jinaai/negation-dataset
9
  language: en
10
  license: apache-2.0
11
+ model-index:
12
+ - name: jina-embedding-b-en-v1
13
+ results:
14
+ - task:
15
+ type: Classification
16
+ dataset:
17
+ type: mteb/amazon_counterfactual
18
+ name: MTEB AmazonCounterfactualClassification (en)
19
+ config: en
20
+ split: test
21
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
22
+ metrics:
23
+ - type: accuracy
24
+ value: 66.58208955223881
25
+ - type: ap
26
+ value: 28.455148149555754
27
+ - type: f1
28
+ value: 59.973775371110385
29
+ - task:
30
+ type: Classification
31
+ dataset:
32
+ type: mteb/amazon_polarity
33
+ name: MTEB AmazonPolarityClassification
34
+ config: default
35
+ split: test
36
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
37
+ metrics:
38
+ - type: accuracy
39
+ value: 65.09505
40
+ - type: ap
41
+ value: 61.387245649832614
42
+ - type: f1
43
+ value: 62.96831291412068
44
+ - task:
45
+ type: Classification
46
+ dataset:
47
+ type: mteb/amazon_reviews_multi
48
+ name: MTEB AmazonReviewsClassification (en)
49
+ config: en
50
+ split: test
51
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
52
+ metrics:
53
+ - type: accuracy
54
+ value: 30.633999999999993
55
+ - type: f1
56
+ value: 29.638828990078647
57
+ - task:
58
+ type: Retrieval
59
+ dataset:
60
+ type: arguana
61
+ name: MTEB ArguAna
62
+ config: default
63
+ split: test
64
+ revision: None
65
+ metrics:
66
+ - type: map_at_1
67
+ value: 25.889
68
+ - type: map_at_10
69
+ value: 40.604
70
+ - type: map_at_100
71
+ value: 41.697
72
+ - type: map_at_1000
73
+ value: 41.705999999999996
74
+ - type: map_at_3
75
+ value: 35.217999999999996
76
+ - type: map_at_5
77
+ value: 38.326
78
+ - type: mrr_at_1
79
+ value: 26.245
80
+ - type: mrr_at_10
81
+ value: 40.736
82
+ - type: mrr_at_100
83
+ value: 41.829
84
+ - type: mrr_at_1000
85
+ value: 41.837999999999994
86
+ - type: mrr_at_3
87
+ value: 35.349000000000004
88
+ - type: mrr_at_5
89
+ value: 38.425
90
+ - type: ndcg_at_1
91
+ value: 25.889
92
+ - type: ndcg_at_10
93
+ value: 49.347
94
+ - type: ndcg_at_100
95
+ value: 53.956
96
+ - type: ndcg_at_1000
97
+ value: 54.2
98
+ - type: ndcg_at_3
99
+ value: 38.282
100
+ - type: ndcg_at_5
101
+ value: 43.895
102
+ - type: precision_at_1
103
+ value: 25.889
104
+ - type: precision_at_10
105
+ value: 7.752000000000001
106
+ - type: precision_at_100
107
+ value: 0.976
108
+ - type: precision_at_1000
109
+ value: 0.1
110
+ - type: precision_at_3
111
+ value: 15.717999999999998
112
+ - type: precision_at_5
113
+ value: 12.162
114
+ - type: recall_at_1
115
+ value: 25.889
116
+ - type: recall_at_10
117
+ value: 77.525
118
+ - type: recall_at_100
119
+ value: 97.58200000000001
120
+ - type: recall_at_1000
121
+ value: 99.502
122
+ - type: recall_at_3
123
+ value: 47.155
124
+ - type: recall_at_5
125
+ value: 60.81100000000001
126
+ - task:
127
+ type: Clustering
128
+ dataset:
129
+ type: mteb/arxiv-clustering-p2p
130
+ name: MTEB ArxivClusteringP2P
131
+ config: default
132
+ split: test
133
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
134
+ metrics:
135
+ - type: v_measure
136
+ value: 39.2179862062943
137
+ - task:
138
+ type: Clustering
139
+ dataset:
140
+ type: mteb/arxiv-clustering-s2s
141
+ name: MTEB ArxivClusteringS2S
142
+ config: default
143
+ split: test
144
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
145
+ metrics:
146
+ - type: v_measure
147
+ value: 29.87826673088078
148
+ - task:
149
+ type: Reranking
150
+ dataset:
151
+ type: mteb/askubuntudupquestions-reranking
152
+ name: MTEB AskUbuntuDupQuestions
153
+ config: default
154
+ split: test
155
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
156
+ metrics:
157
+ - type: map
158
+ value: 62.72401299412015
159
+ - type: mrr
160
+ value: 75.45167743921206
161
+ - task:
162
+ type: STS
163
+ dataset:
164
+ type: mteb/biosses-sts
165
+ name: MTEB BIOSSES
166
+ config: default
167
+ split: test
168
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
169
+ metrics:
170
+ - type: cos_sim_pearson
171
+ value: 85.96510928112639
172
+ - type: cos_sim_spearman
173
+ value: 82.64224450538681
174
+ - type: euclidean_pearson
175
+ value: 52.03458755006108
176
+ - type: euclidean_spearman
177
+ value: 52.83192670285616
178
+ - type: manhattan_pearson
179
+ value: 52.14561955040935
180
+ - type: manhattan_spearman
181
+ value: 52.9584356095438
182
+ - task:
183
+ type: Classification
184
+ dataset:
185
+ type: mteb/banking77
186
+ name: MTEB Banking77Classification
187
+ config: default
188
+ split: test
189
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
190
+ metrics:
191
+ - type: accuracy
192
+ value: 84.11363636363636
193
+ - type: f1
194
+ value: 84.01098114920124
195
+ - task:
196
+ type: Clustering
197
+ dataset:
198
+ type: mteb/biorxiv-clustering-p2p
199
+ name: MTEB BiorxivClusteringP2P
200
+ config: default
201
+ split: test
202
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
203
+ metrics:
204
+ - type: v_measure
205
+ value: 32.991971466919026
206
+ - task:
207
+ type: Clustering
208
+ dataset:
209
+ type: mteb/biorxiv-clustering-s2s
210
+ name: MTEB BiorxivClusteringS2S
211
+ config: default
212
+ split: test
213
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
214
+ metrics:
215
+ - type: v_measure
216
+ value: 26.48807922559519
217
+ - task:
218
+ type: Retrieval
219
+ dataset:
220
+ type: climate-fever
221
+ name: MTEB ClimateFEVER
222
+ config: default
223
+ split: test
224
+ revision: None
225
+ metrics:
226
+ - type: map_at_1
227
+ value: 8.014000000000001
228
+ - type: map_at_10
229
+ value: 14.149999999999999
230
+ - type: map_at_100
231
+ value: 15.539
232
+ - type: map_at_1000
233
+ value: 15.711
234
+ - type: map_at_3
235
+ value: 11.913
236
+ - type: map_at_5
237
+ value: 12.982
238
+ - type: mrr_at_1
239
+ value: 18.046
240
+ - type: mrr_at_10
241
+ value: 28.224
242
+ - type: mrr_at_100
243
+ value: 29.293000000000003
244
+ - type: mrr_at_1000
245
+ value: 29.348999999999997
246
+ - type: mrr_at_3
247
+ value: 25.179000000000002
248
+ - type: mrr_at_5
249
+ value: 26.827
250
+ - type: ndcg_at_1
251
+ value: 18.046
252
+ - type: ndcg_at_10
253
+ value: 20.784
254
+ - type: ndcg_at_100
255
+ value: 26.939999999999998
256
+ - type: ndcg_at_1000
257
+ value: 30.453999999999997
258
+ - type: ndcg_at_3
259
+ value: 16.694
260
+ - type: ndcg_at_5
261
+ value: 18.049
262
+ - type: precision_at_1
263
+ value: 18.046
264
+ - type: precision_at_10
265
+ value: 6.5280000000000005
266
+ - type: precision_at_100
267
+ value: 1.2959999999999998
268
+ - type: precision_at_1000
269
+ value: 0.19499999999999998
270
+ - type: precision_at_3
271
+ value: 12.465
272
+ - type: precision_at_5
273
+ value: 9.511
274
+ - type: recall_at_1
275
+ value: 8.014000000000001
276
+ - type: recall_at_10
277
+ value: 26.021
278
+ - type: recall_at_100
279
+ value: 47.692
280
+ - type: recall_at_1000
281
+ value: 67.63
282
+ - type: recall_at_3
283
+ value: 16.122
284
+ - type: recall_at_5
285
+ value: 19.817
286
+ - task:
287
+ type: Retrieval
288
+ dataset:
289
+ type: dbpedia-entity
290
+ name: MTEB DBPedia
291
+ config: default
292
+ split: test
293
+ revision: None
294
+ metrics:
295
+ - type: map_at_1
296
+ value: 7.396
297
+ - type: map_at_10
298
+ value: 14.543000000000001
299
+ - type: map_at_100
300
+ value: 19.235
301
+ - type: map_at_1000
302
+ value: 20.384
303
+ - type: map_at_3
304
+ value: 10.886
305
+ - type: map_at_5
306
+ value: 12.61
307
+ - type: mrr_at_1
308
+ value: 55.50000000000001
309
+ - type: mrr_at_10
310
+ value: 63.731
311
+ - type: mrr_at_100
312
+ value: 64.256
313
+ - type: mrr_at_1000
314
+ value: 64.27000000000001
315
+ - type: mrr_at_3
316
+ value: 61.583
317
+ - type: mrr_at_5
318
+ value: 62.92100000000001
319
+ - type: ndcg_at_1
320
+ value: 43.375
321
+ - type: ndcg_at_10
322
+ value: 31.352000000000004
323
+ - type: ndcg_at_100
324
+ value: 34.717999999999996
325
+ - type: ndcg_at_1000
326
+ value: 41.959
327
+ - type: ndcg_at_3
328
+ value: 35.319
329
+ - type: ndcg_at_5
330
+ value: 33.222
331
+ - type: precision_at_1
332
+ value: 55.50000000000001
333
+ - type: precision_at_10
334
+ value: 24.15
335
+ - type: precision_at_100
336
+ value: 7.42
337
+ - type: precision_at_1000
338
+ value: 1.66
339
+ - type: precision_at_3
340
+ value: 37.917
341
+ - type: precision_at_5
342
+ value: 31.900000000000002
343
+ - type: recall_at_1
344
+ value: 7.396
345
+ - type: recall_at_10
346
+ value: 19.686999999999998
347
+ - type: recall_at_100
348
+ value: 40.465
349
+ - type: recall_at_1000
350
+ value: 63.79899999999999
351
+ - type: recall_at_3
352
+ value: 12.124
353
+ - type: recall_at_5
354
+ value: 15.28
355
+ - task:
356
+ type: Classification
357
+ dataset:
358
+ type: mteb/emotion
359
+ name: MTEB EmotionClassification
360
+ config: default
361
+ split: test
362
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
363
+ metrics:
364
+ - type: accuracy
365
+ value: 41.33
366
+ - type: f1
367
+ value: 37.682972473685496
368
+ - task:
369
+ type: Retrieval
370
+ dataset:
371
+ type: fever
372
+ name: MTEB FEVER
373
+ config: default
374
+ split: test
375
+ revision: None
376
+ metrics:
377
+ - type: map_at_1
378
+ value: 49.019
379
+ - type: map_at_10
380
+ value: 61.219
381
+ - type: map_at_100
382
+ value: 61.753
383
+ - type: map_at_1000
384
+ value: 61.771
385
+ - type: map_at_3
386
+ value: 58.952000000000005
387
+ - type: map_at_5
388
+ value: 60.239
389
+ - type: mrr_at_1
390
+ value: 53.0
391
+ - type: mrr_at_10
392
+ value: 65.678
393
+ - type: mrr_at_100
394
+ value: 66.147
395
+ - type: mrr_at_1000
396
+ value: 66.155
397
+ - type: mrr_at_3
398
+ value: 63.495999999999995
399
+ - type: mrr_at_5
400
+ value: 64.75800000000001
401
+ - type: ndcg_at_1
402
+ value: 53.0
403
+ - type: ndcg_at_10
404
+ value: 67.587
405
+ - type: ndcg_at_100
406
+ value: 69.877
407
+ - type: ndcg_at_1000
408
+ value: 70.25200000000001
409
+ - type: ndcg_at_3
410
+ value: 63.174
411
+ - type: ndcg_at_5
412
+ value: 65.351
413
+ - type: precision_at_1
414
+ value: 53.0
415
+ - type: precision_at_10
416
+ value: 9.067
417
+ - type: precision_at_100
418
+ value: 1.026
419
+ - type: precision_at_1000
420
+ value: 0.107
421
+ - type: precision_at_3
422
+ value: 25.728
423
+ - type: precision_at_5
424
+ value: 16.637
425
+ - type: recall_at_1
426
+ value: 49.019
427
+ - type: recall_at_10
428
+ value: 82.962
429
+ - type: recall_at_100
430
+ value: 92.917
431
+ - type: recall_at_1000
432
+ value: 95.511
433
+ - type: recall_at_3
434
+ value: 70.838
435
+ - type: recall_at_5
436
+ value: 76.201
437
+ - task:
438
+ type: Retrieval
439
+ dataset:
440
+ type: fiqa
441
+ name: MTEB FiQA2018
442
+ config: default
443
+ split: test
444
+ revision: None
445
+ metrics:
446
+ - type: map_at_1
447
+ value: 16.714000000000002
448
+ - type: map_at_10
449
+ value: 28.041
450
+ - type: map_at_100
451
+ value: 29.75
452
+ - type: map_at_1000
453
+ value: 29.944
454
+ - type: map_at_3
455
+ value: 23.884
456
+ - type: map_at_5
457
+ value: 26.468000000000004
458
+ - type: mrr_at_1
459
+ value: 33.796
460
+ - type: mrr_at_10
461
+ value: 42.757
462
+ - type: mrr_at_100
463
+ value: 43.705
464
+ - type: mrr_at_1000
465
+ value: 43.751
466
+ - type: mrr_at_3
467
+ value: 40.406
468
+ - type: mrr_at_5
469
+ value: 41.88
470
+ - type: ndcg_at_1
471
+ value: 33.796
472
+ - type: ndcg_at_10
473
+ value: 35.482
474
+ - type: ndcg_at_100
475
+ value: 42.44
476
+ - type: ndcg_at_1000
477
+ value: 45.903
478
+ - type: ndcg_at_3
479
+ value: 31.922
480
+ - type: ndcg_at_5
481
+ value: 33.516
482
+ - type: precision_at_1
483
+ value: 33.796
484
+ - type: precision_at_10
485
+ value: 10.108
486
+ - type: precision_at_100
487
+ value: 1.735
488
+ - type: precision_at_1000
489
+ value: 0.23500000000000001
490
+ - type: precision_at_3
491
+ value: 21.759
492
+ - type: precision_at_5
493
+ value: 16.605
494
+ - type: recall_at_1
495
+ value: 16.714000000000002
496
+ - type: recall_at_10
497
+ value: 42.38
498
+ - type: recall_at_100
499
+ value: 68.84700000000001
500
+ - type: recall_at_1000
501
+ value: 90.036
502
+ - type: recall_at_3
503
+ value: 28.776000000000003
504
+ - type: recall_at_5
505
+ value: 35.606
506
+ - task:
507
+ type: Retrieval
508
+ dataset:
509
+ type: hotpotqa
510
+ name: MTEB HotpotQA
511
+ config: default
512
+ split: test
513
+ revision: None
514
+ metrics:
515
+ - type: map_at_1
516
+ value: 29.534
517
+ - type: map_at_10
518
+ value: 40.857
519
+ - type: map_at_100
520
+ value: 41.715999999999994
521
+ - type: map_at_1000
522
+ value: 41.795
523
+ - type: map_at_3
524
+ value: 38.415
525
+ - type: map_at_5
526
+ value: 39.833
527
+ - type: mrr_at_1
528
+ value: 59.068
529
+ - type: mrr_at_10
530
+ value: 66.034
531
+ - type: mrr_at_100
532
+ value: 66.479
533
+ - type: mrr_at_1000
534
+ value: 66.50399999999999
535
+ - type: mrr_at_3
536
+ value: 64.38000000000001
537
+ - type: mrr_at_5
538
+ value: 65.40599999999999
539
+ - type: ndcg_at_1
540
+ value: 59.068
541
+ - type: ndcg_at_10
542
+ value: 49.638
543
+ - type: ndcg_at_100
544
+ value: 53.093999999999994
545
+ - type: ndcg_at_1000
546
+ value: 54.813
547
+ - type: ndcg_at_3
548
+ value: 45.537
549
+ - type: ndcg_at_5
550
+ value: 47.671
551
+ - type: precision_at_1
552
+ value: 59.068
553
+ - type: precision_at_10
554
+ value: 10.313
555
+ - type: precision_at_100
556
+ value: 1.304
557
+ - type: precision_at_1000
558
+ value: 0.153
559
+ - type: precision_at_3
560
+ value: 28.278
561
+ - type: precision_at_5
562
+ value: 18.658
563
+ - type: recall_at_1
564
+ value: 29.534
565
+ - type: recall_at_10
566
+ value: 51.56699999999999
567
+ - type: recall_at_100
568
+ value: 65.199
569
+ - type: recall_at_1000
570
+ value: 76.678
571
+ - type: recall_at_3
572
+ value: 42.417
573
+ - type: recall_at_5
574
+ value: 46.644000000000005
575
+ - task:
576
+ type: Classification
577
+ dataset:
578
+ type: mteb/imdb
579
+ name: MTEB ImdbClassification
580
+ config: default
581
+ split: test
582
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
583
+ metrics:
584
+ - type: accuracy
585
+ value: 65.74719999999999
586
+ - type: ap
587
+ value: 60.57322504947344
588
+ - type: f1
589
+ value: 65.37875006542282
590
+ - task:
591
+ type: Retrieval
592
+ dataset:
593
+ type: msmarco
594
+ name: MTEB MSMARCO
595
+ config: default
596
+ split: dev
597
+ revision: None
598
+ metrics:
599
+ - type: map_at_1
600
+ value: 15.695999999999998
601
+ - type: map_at_10
602
+ value: 26.661
603
+ - type: map_at_100
604
+ value: 27.982000000000003
605
+ - type: map_at_1000
606
+ value: 28.049000000000003
607
+ - type: map_at_3
608
+ value: 23.057
609
+ - type: map_at_5
610
+ value: 25.079
611
+ - type: mrr_at_1
612
+ value: 16.16
613
+ - type: mrr_at_10
614
+ value: 27.150999999999996
615
+ - type: mrr_at_100
616
+ value: 28.423
617
+ - type: mrr_at_1000
618
+ value: 28.483999999999998
619
+ - type: mrr_at_3
620
+ value: 23.577
621
+ - type: mrr_at_5
622
+ value: 25.585
623
+ - type: ndcg_at_1
624
+ value: 16.16
625
+ - type: ndcg_at_10
626
+ value: 33.017
627
+ - type: ndcg_at_100
628
+ value: 39.582
629
+ - type: ndcg_at_1000
630
+ value: 41.28
631
+ - type: ndcg_at_3
632
+ value: 25.607000000000003
633
+ - type: ndcg_at_5
634
+ value: 29.214000000000002
635
+ - type: precision_at_1
636
+ value: 16.16
637
+ - type: precision_at_10
638
+ value: 5.506
639
+ - type: precision_at_100
640
+ value: 0.882
641
+ - type: precision_at_1000
642
+ value: 0.10300000000000001
643
+ - type: precision_at_3
644
+ value: 11.199
645
+ - type: precision_at_5
646
+ value: 8.55
647
+ - type: recall_at_1
648
+ value: 15.695999999999998
649
+ - type: recall_at_10
650
+ value: 52.736000000000004
651
+ - type: recall_at_100
652
+ value: 83.523
653
+ - type: recall_at_1000
654
+ value: 96.588
655
+ - type: recall_at_3
656
+ value: 32.484
657
+ - type: recall_at_5
658
+ value: 41.117
659
+ - task:
660
+ type: Classification
661
+ dataset:
662
+ type: mteb/mtop_domain
663
+ name: MTEB MTOPDomainClassification (en)
664
+ config: en
665
+ split: test
666
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
667
+ metrics:
668
+ - type: accuracy
669
+ value: 91.71682626538988
670
+ - type: f1
671
+ value: 91.60647677401211
672
+ - task:
673
+ type: Classification
674
+ dataset:
675
+ type: mteb/mtop_intent
676
+ name: MTEB MTOPIntentClassification (en)
677
+ config: en
678
+ split: test
679
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
680
+ metrics:
681
+ - type: accuracy
682
+ value: 74.94756041951665
683
+ - type: f1
684
+ value: 57.26936028487369
685
+ - task:
686
+ type: Classification
687
+ dataset:
688
+ type: mteb/amazon_massive_intent
689
+ name: MTEB MassiveIntentClassification (en)
690
+ config: en
691
+ split: test
692
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
693
+ metrics:
694
+ - type: accuracy
695
+ value: 71.43241425689307
696
+ - type: f1
697
+ value: 68.80370629448252
698
+ - task:
699
+ type: Classification
700
+ dataset:
701
+ type: mteb/amazon_massive_scenario
702
+ name: MTEB MassiveScenarioClassification (en)
703
+ config: en
704
+ split: test
705
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
706
+ metrics:
707
+ - type: accuracy
708
+ value: 77.04774714189642
709
+ - type: f1
710
+ value: 76.93545888412446
711
+ - task:
712
+ type: Clustering
713
+ dataset:
714
+ type: mteb/medrxiv-clustering-p2p
715
+ name: MTEB MedrxivClusteringP2P
716
+ config: default
717
+ split: test
718
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
719
+ metrics:
720
+ - type: v_measure
721
+ value: 30.009784989313765
722
+ - task:
723
+ type: Clustering
724
+ dataset:
725
+ type: mteb/medrxiv-clustering-s2s
726
+ name: MTEB MedrxivClusteringS2S
727
+ config: default
728
+ split: test
729
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
730
+ metrics:
731
+ - type: v_measure
732
+ value: 25.568442512328872
733
+ - task:
734
+ type: Reranking
735
+ dataset:
736
+ type: mteb/mind_small
737
+ name: MTEB MindSmallReranking
738
+ config: default
739
+ split: test
740
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
741
+ metrics:
742
+ - type: map
743
+ value: 31.013959341949697
744
+ - type: mrr
745
+ value: 31.998487836684575
746
+ - task:
747
+ type: Retrieval
748
+ dataset:
749
+ type: nfcorpus
750
+ name: MTEB NFCorpus
751
+ config: default
752
+ split: test
753
+ revision: None
754
+ metrics:
755
+ - type: map_at_1
756
+ value: 4.316
757
+ - type: map_at_10
758
+ value: 10.287
759
+ - type: map_at_100
760
+ value: 12.817
761
+ - type: map_at_1000
762
+ value: 14.141
763
+ - type: map_at_3
764
+ value: 7.728
765
+ - type: map_at_5
766
+ value: 8.876000000000001
767
+ - type: mrr_at_1
768
+ value: 39.628
769
+ - type: mrr_at_10
770
+ value: 48.423
771
+ - type: mrr_at_100
772
+ value: 49.153999999999996
773
+ - type: mrr_at_1000
774
+ value: 49.198
775
+ - type: mrr_at_3
776
+ value: 45.666000000000004
777
+ - type: mrr_at_5
778
+ value: 47.477000000000004
779
+ - type: ndcg_at_1
780
+ value: 36.533
781
+ - type: ndcg_at_10
782
+ value: 29.304000000000002
783
+ - type: ndcg_at_100
784
+ value: 27.078000000000003
785
+ - type: ndcg_at_1000
786
+ value: 36.221
787
+ - type: ndcg_at_3
788
+ value: 33.256
789
+ - type: ndcg_at_5
790
+ value: 31.465
791
+ - type: precision_at_1
792
+ value: 39.009
793
+ - type: precision_at_10
794
+ value: 22.043
795
+ - type: precision_at_100
796
+ value: 7.115
797
+ - type: precision_at_1000
798
+ value: 1.991
799
+ - type: precision_at_3
800
+ value: 31.476
801
+ - type: precision_at_5
802
+ value: 27.616000000000003
803
+ - type: recall_at_1
804
+ value: 4.316
805
+ - type: recall_at_10
806
+ value: 14.507
807
+ - type: recall_at_100
808
+ value: 28.847
809
+ - type: recall_at_1000
810
+ value: 61.758
811
+ - type: recall_at_3
812
+ value: 8.753
813
+ - type: recall_at_5
814
+ value: 11.153
815
+ - task:
816
+ type: Retrieval
817
+ dataset:
818
+ type: nq
819
+ name: MTEB NQ
820
+ config: default
821
+ split: test
822
+ revision: None
823
+ metrics:
824
+ - type: map_at_1
825
+ value: 22.374
826
+ - type: map_at_10
827
+ value: 36.095
828
+ - type: map_at_100
829
+ value: 37.413999999999994
830
+ - type: map_at_1000
831
+ value: 37.46
832
+ - type: map_at_3
833
+ value: 31.711
834
+ - type: map_at_5
835
+ value: 34.294999999999995
836
+ - type: mrr_at_1
837
+ value: 25.406000000000002
838
+ - type: mrr_at_10
839
+ value: 38.424
840
+ - type: mrr_at_100
841
+ value: 39.456
842
+ - type: mrr_at_1000
843
+ value: 39.488
844
+ - type: mrr_at_3
845
+ value: 34.613
846
+ - type: mrr_at_5
847
+ value: 36.864999999999995
848
+ - type: ndcg_at_1
849
+ value: 25.406000000000002
850
+ - type: ndcg_at_10
851
+ value: 43.614000000000004
852
+ - type: ndcg_at_100
853
+ value: 49.166
854
+ - type: ndcg_at_1000
855
+ value: 50.212
856
+ - type: ndcg_at_3
857
+ value: 35.221999999999994
858
+ - type: ndcg_at_5
859
+ value: 39.571
860
+ - type: precision_at_1
861
+ value: 25.406000000000002
862
+ - type: precision_at_10
863
+ value: 7.654
864
+ - type: precision_at_100
865
+ value: 1.0699999999999998
866
+ - type: precision_at_1000
867
+ value: 0.117
868
+ - type: precision_at_3
869
+ value: 16.425
870
+ - type: precision_at_5
871
+ value: 12.352
872
+ - type: recall_at_1
873
+ value: 22.374
874
+ - type: recall_at_10
875
+ value: 64.337
876
+ - type: recall_at_100
877
+ value: 88.374
878
+ - type: recall_at_1000
879
+ value: 96.101
880
+ - type: recall_at_3
881
+ value: 42.5
882
+ - type: recall_at_5
883
+ value: 52.556000000000004
884
+ - task:
885
+ type: Retrieval
886
+ dataset:
887
+ type: quora
888
+ name: MTEB QuoraRetrieval
889
+ config: default
890
+ split: test
891
+ revision: None
892
+ metrics:
893
+ - type: map_at_1
894
+ value: 69.301
895
+ - type: map_at_10
896
+ value: 83.128
897
+ - type: map_at_100
898
+ value: 83.779
899
+ - type: map_at_1000
900
+ value: 83.798
901
+ - type: map_at_3
902
+ value: 80.11399999999999
903
+ - type: map_at_5
904
+ value: 82.00699999999999
905
+ - type: mrr_at_1
906
+ value: 79.81
907
+ - type: mrr_at_10
908
+ value: 86.28
909
+ - type: mrr_at_100
910
+ value: 86.399
911
+ - type: mrr_at_1000
912
+ value: 86.401
913
+ - type: mrr_at_3
914
+ value: 85.26
915
+ - type: mrr_at_5
916
+ value: 85.93499999999999
917
+ - type: ndcg_at_1
918
+ value: 79.80000000000001
919
+ - type: ndcg_at_10
920
+ value: 87.06700000000001
921
+ - type: ndcg_at_100
922
+ value: 88.41799999999999
923
+ - type: ndcg_at_1000
924
+ value: 88.554
925
+ - type: ndcg_at_3
926
+ value: 84.052
927
+ - type: ndcg_at_5
928
+ value: 85.711
929
+ - type: precision_at_1
930
+ value: 79.80000000000001
931
+ - type: precision_at_10
932
+ value: 13.224
933
+ - type: precision_at_100
934
+ value: 1.5230000000000001
935
+ - type: precision_at_1000
936
+ value: 0.157
937
+ - type: precision_at_3
938
+ value: 36.723
939
+ - type: precision_at_5
940
+ value: 24.192
941
+ - type: recall_at_1
942
+ value: 69.301
943
+ - type: recall_at_10
944
+ value: 94.589
945
+ - type: recall_at_100
946
+ value: 99.29299999999999
947
+ - type: recall_at_1000
948
+ value: 99.965
949
+ - type: recall_at_3
950
+ value: 86.045
951
+ - type: recall_at_5
952
+ value: 90.656
953
+ - task:
954
+ type: Clustering
955
+ dataset:
956
+ type: mteb/reddit-clustering
957
+ name: MTEB RedditClustering
958
+ config: default
959
+ split: test
960
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
961
+ metrics:
962
+ - type: v_measure
963
+ value: 43.09903181165838
964
+ - task:
965
+ type: Clustering
966
+ dataset:
967
+ type: mteb/reddit-clustering-p2p
968
+ name: MTEB RedditClusteringP2P
969
+ config: default
970
+ split: test
971
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
972
+ metrics:
973
+ - type: v_measure
974
+ value: 51.710378422887594
975
+ - task:
976
+ type: Retrieval
977
+ dataset:
978
+ type: scidocs
979
+ name: MTEB SCIDOCS
980
+ config: default
981
+ split: test
982
+ revision: None
983
+ metrics:
984
+ - type: map_at_1
985
+ value: 4.138
986
+ - type: map_at_10
987
+ value: 10.419
988
+ - type: map_at_100
989
+ value: 12.321
990
+ - type: map_at_1000
991
+ value: 12.605
992
+ - type: map_at_3
993
+ value: 7.445
994
+ - type: map_at_5
995
+ value: 8.859
996
+ - type: mrr_at_1
997
+ value: 20.4
998
+ - type: mrr_at_10
999
+ value: 30.148999999999997
1000
+ - type: mrr_at_100
1001
+ value: 31.357000000000003
1002
+ - type: mrr_at_1000
1003
+ value: 31.424999999999997
1004
+ - type: mrr_at_3
1005
+ value: 26.983
1006
+ - type: mrr_at_5
1007
+ value: 28.883
1008
+ - type: ndcg_at_1
1009
+ value: 20.4
1010
+ - type: ndcg_at_10
1011
+ value: 17.713
1012
+ - type: ndcg_at_100
1013
+ value: 25.221
1014
+ - type: ndcg_at_1000
1015
+ value: 30.381999999999998
1016
+ - type: ndcg_at_3
1017
+ value: 16.607
1018
+ - type: ndcg_at_5
1019
+ value: 14.559
1020
+ - type: precision_at_1
1021
+ value: 20.4
1022
+ - type: precision_at_10
1023
+ value: 9.3
1024
+ - type: precision_at_100
1025
+ value: 2.0060000000000002
1026
+ - type: precision_at_1000
1027
+ value: 0.32399999999999995
1028
+ - type: precision_at_3
1029
+ value: 15.5
1030
+ - type: precision_at_5
1031
+ value: 12.839999999999998
1032
+ - type: recall_at_1
1033
+ value: 4.138
1034
+ - type: recall_at_10
1035
+ value: 18.813
1036
+ - type: recall_at_100
1037
+ value: 40.692
1038
+ - type: recall_at_1000
1039
+ value: 65.835
1040
+ - type: recall_at_3
1041
+ value: 9.418
1042
+ - type: recall_at_5
1043
+ value: 12.983
1044
+ - task:
1045
+ type: STS
1046
+ dataset:
1047
+ type: mteb/sickr-sts
1048
+ name: MTEB SICK-R
1049
+ config: default
1050
+ split: test
1051
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1052
+ metrics:
1053
+ - type: cos_sim_pearson
1054
+ value: 83.25944192442188
1055
+ - type: cos_sim_spearman
1056
+ value: 75.04296759426568
1057
+ - type: euclidean_pearson
1058
+ value: 74.8130340249869
1059
+ - type: euclidean_spearman
1060
+ value: 68.40180320816793
1061
+ - type: manhattan_pearson
1062
+ value: 74.9149619199144
1063
+ - type: manhattan_spearman
1064
+ value: 68.52380798258379
1065
+ - task:
1066
+ type: STS
1067
+ dataset:
1068
+ type: mteb/sts12-sts
1069
+ name: MTEB STS12
1070
+ config: default
1071
+ split: test
1072
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1073
+ metrics:
1074
+ - type: cos_sim_pearson
1075
+ value: 81.91983072545858
1076
+ - type: cos_sim_spearman
1077
+ value: 73.5129498787296
1078
+ - type: euclidean_pearson
1079
+ value: 66.76535523270856
1080
+ - type: euclidean_spearman
1081
+ value: 56.64797879544097
1082
+ - type: manhattan_pearson
1083
+ value: 66.12191731384162
1084
+ - type: manhattan_spearman
1085
+ value: 56.37753861965956
1086
+ - task:
1087
+ type: STS
1088
+ dataset:
1089
+ type: mteb/sts13-sts
1090
+ name: MTEB STS13
1091
+ config: default
1092
+ split: test
1093
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1094
+ metrics:
1095
+ - type: cos_sim_pearson
1096
+ value: 77.71164758747632
1097
+ - type: cos_sim_spearman
1098
+ value: 79.1530762030973
1099
+ - type: euclidean_pearson
1100
+ value: 69.50621786400177
1101
+ - type: euclidean_spearman
1102
+ value: 70.44898083428744
1103
+ - type: manhattan_pearson
1104
+ value: 69.04018458995307
1105
+ - type: manhattan_spearman
1106
+ value: 70.00888532086853
1107
+ - task:
1108
+ type: STS
1109
+ dataset:
1110
+ type: mteb/sts14-sts
1111
+ name: MTEB STS14
1112
+ config: default
1113
+ split: test
1114
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
1115
+ metrics:
1116
+ - type: cos_sim_pearson
1117
+ value: 78.90774995778577
1118
+ - type: cos_sim_spearman
1119
+ value: 75.24229403562713
1120
+ - type: euclidean_pearson
1121
+ value: 68.5838924571539
1122
+ - type: euclidean_spearman
1123
+ value: 65.06652398167358
1124
+ - type: manhattan_pearson
1125
+ value: 68.23143277902628
1126
+ - type: manhattan_spearman
1127
+ value: 64.79624516012709
1128
+ - task:
1129
+ type: STS
1130
+ dataset:
1131
+ type: mteb/sts15-sts
1132
+ name: MTEB STS15
1133
+ config: default
1134
+ split: test
1135
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
1136
+ metrics:
1137
+ - type: cos_sim_pearson
1138
+ value: 83.78074322110155
1139
+ - type: cos_sim_spearman
1140
+ value: 85.12071478276958
1141
+ - type: euclidean_pearson
1142
+ value: 65.00147804089737
1143
+ - type: euclidean_spearman
1144
+ value: 66.02559342831921
1145
+ - type: manhattan_pearson
1146
+ value: 65.01270190203297
1147
+ - type: manhattan_spearman
1148
+ value: 66.13038450207748
1149
+ - task:
1150
+ type: STS
1151
+ dataset:
1152
+ type: mteb/sts16-sts
1153
+ name: MTEB STS16
1154
+ config: default
1155
+ split: test
1156
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
1157
+ metrics:
1158
+ - type: cos_sim_pearson
1159
+ value: 77.29395327338185
1160
+ - type: cos_sim_spearman
1161
+ value: 80.07128686563352
1162
+ - type: euclidean_pearson
1163
+ value: 65.97939065455975
1164
+ - type: euclidean_spearman
1165
+ value: 66.80283051081129
1166
+ - type: manhattan_pearson
1167
+ value: 65.6750450606584
1168
+ - type: manhattan_spearman
1169
+ value: 66.55805829330733
1170
+ - task:
1171
+ type: STS
1172
+ dataset:
1173
+ type: mteb/sts17-crosslingual-sts
1174
+ name: MTEB STS17 (en-en)
1175
+ config: en-en
1176
+ split: test
1177
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
1178
+ metrics:
1179
+ - type: cos_sim_pearson
1180
+ value: 87.64956503192369
1181
+ - type: cos_sim_spearman
1182
+ value: 87.95719598052727
1183
+ - type: euclidean_pearson
1184
+ value: 73.35178669405819
1185
+ - type: euclidean_spearman
1186
+ value: 71.58959083579994
1187
+ - type: manhattan_pearson
1188
+ value: 73.24156949179472
1189
+ - type: manhattan_spearman
1190
+ value: 71.35933730170666
1191
+ - task:
1192
+ type: STS
1193
+ dataset:
1194
+ type: mteb/sts22-crosslingual-sts
1195
+ name: MTEB STS22 (en)
1196
+ config: en
1197
+ split: test
1198
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
1199
+ metrics:
1200
+ - type: cos_sim_pearson
1201
+ value: 66.61640922485357
1202
+ - type: cos_sim_spearman
1203
+ value: 66.08406266387749
1204
+ - type: euclidean_pearson
1205
+ value: 43.684972836995776
1206
+ - type: euclidean_spearman
1207
+ value: 60.26686390609082
1208
+ - type: manhattan_pearson
1209
+ value: 43.694268683941154
1210
+ - type: manhattan_spearman
1211
+ value: 59.61419719435629
1212
+ - task:
1213
+ type: STS
1214
+ dataset:
1215
+ type: mteb/stsbenchmark-sts
1216
+ name: MTEB STSBenchmark
1217
+ config: default
1218
+ split: test
1219
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
1220
+ metrics:
1221
+ - type: cos_sim_pearson
1222
+ value: 81.73624666044613
1223
+ - type: cos_sim_spearman
1224
+ value: 81.68869881979401
1225
+ - type: euclidean_pearson
1226
+ value: 72.47205990508046
1227
+ - type: euclidean_spearman
1228
+ value: 71.02381428101695
1229
+ - type: manhattan_pearson
1230
+ value: 72.4947870027535
1231
+ - type: manhattan_spearman
1232
+ value: 71.0789806652577
1233
+ - task:
1234
+ type: Reranking
1235
+ dataset:
1236
+ type: mteb/scidocs-reranking
1237
+ name: MTEB SciDocsRR
1238
+ config: default
1239
+ split: test
1240
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
1241
+ metrics:
1242
+ - type: map
1243
+ value: 79.53671929012175
1244
+ - type: mrr
1245
+ value: 93.96566033820936
1246
+ - task:
1247
+ type: Retrieval
1248
+ dataset:
1249
+ type: scifact
1250
+ name: MTEB SciFact
1251
+ config: default
1252
+ split: test
1253
+ revision: None
1254
+ metrics:
1255
+ - type: map_at_1
1256
+ value: 43.761
1257
+ - type: map_at_10
1258
+ value: 53.846000000000004
1259
+ - type: map_at_100
1260
+ value: 54.55799999999999
1261
+ - type: map_at_1000
1262
+ value: 54.620999999999995
1263
+ - type: map_at_3
1264
+ value: 51.513
1265
+ - type: map_at_5
1266
+ value: 52.591
1267
+ - type: mrr_at_1
1268
+ value: 46.666999999999994
1269
+ - type: mrr_at_10
1270
+ value: 55.461000000000006
1271
+ - type: mrr_at_100
1272
+ value: 56.008
1273
+ - type: mrr_at_1000
1274
+ value: 56.069
1275
+ - type: mrr_at_3
1276
+ value: 53.5
1277
+ - type: mrr_at_5
1278
+ value: 54.417
1279
+ - type: ndcg_at_1
1280
+ value: 46.666999999999994
1281
+ - type: ndcg_at_10
1282
+ value: 58.599000000000004
1283
+ - type: ndcg_at_100
1284
+ value: 61.538000000000004
1285
+ - type: ndcg_at_1000
1286
+ value: 63.22
1287
+ - type: ndcg_at_3
1288
+ value: 54.254999999999995
1289
+ - type: ndcg_at_5
1290
+ value: 55.861000000000004
1291
+ - type: precision_at_1
1292
+ value: 46.666999999999994
1293
+ - type: precision_at_10
1294
+ value: 8.033
1295
+ - type: precision_at_100
1296
+ value: 0.963
1297
+ - type: precision_at_1000
1298
+ value: 0.11
1299
+ - type: precision_at_3
1300
+ value: 21.667
1301
+ - type: precision_at_5
1302
+ value: 14.066999999999998
1303
+ - type: recall_at_1
1304
+ value: 43.761
1305
+ - type: recall_at_10
1306
+ value: 71.65599999999999
1307
+ - type: recall_at_100
1308
+ value: 84.433
1309
+ - type: recall_at_1000
1310
+ value: 97.5
1311
+ - type: recall_at_3
1312
+ value: 59.522
1313
+ - type: recall_at_5
1314
+ value: 63.632999999999996
1315
+ - task:
1316
+ type: PairClassification
1317
+ dataset:
1318
+ type: mteb/sprintduplicatequestions-pairclassification
1319
+ name: MTEB SprintDuplicateQuestions
1320
+ config: default
1321
+ split: test
1322
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
1323
+ metrics:
1324
+ - type: cos_sim_accuracy
1325
+ value: 99.68811881188118
1326
+ - type: cos_sim_ap
1327
+ value: 91.08077352794682
1328
+ - type: cos_sim_f1
1329
+ value: 84.38570729319628
1330
+ - type: cos_sim_precision
1331
+ value: 82.64621284755513
1332
+ - type: cos_sim_recall
1333
+ value: 86.2
1334
+ - type: dot_accuracy
1335
+ value: 99.14653465346535
1336
+ - type: dot_ap
1337
+ value: 45.24942149367904
1338
+ - type: dot_f1
1339
+ value: 46.470062555853445
1340
+ - type: dot_precision
1341
+ value: 42.003231017770595
1342
+ - type: dot_recall
1343
+ value: 52.0
1344
+ - type: euclidean_accuracy
1345
+ value: 99.56930693069307
1346
+ - type: euclidean_ap
1347
+ value: 80.28575652582506
1348
+ - type: euclidean_f1
1349
+ value: 75.52054023635341
1350
+ - type: euclidean_precision
1351
+ value: 86.35778635778635
1352
+ - type: euclidean_recall
1353
+ value: 67.10000000000001
1354
+ - type: manhattan_accuracy
1355
+ value: 99.56039603960396
1356
+ - type: manhattan_ap
1357
+ value: 79.74630510301085
1358
+ - type: manhattan_f1
1359
+ value: 74.67569091934575
1360
+ - type: manhattan_precision
1361
+ value: 85.64036222509702
1362
+ - type: manhattan_recall
1363
+ value: 66.2
1364
+ - type: max_accuracy
1365
+ value: 99.68811881188118
1366
+ - type: max_ap
1367
+ value: 91.08077352794682
1368
+ - type: max_f1
1369
+ value: 84.38570729319628
1370
+ - task:
1371
+ type: Clustering
1372
+ dataset:
1373
+ type: mteb/stackexchange-clustering
1374
+ name: MTEB StackExchangeClustering
1375
+ config: default
1376
+ split: test
1377
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
1378
+ metrics:
1379
+ - type: v_measure
1380
+ value: 52.0788049295693
1381
+ - task:
1382
+ type: Clustering
1383
+ dataset:
1384
+ type: mteb/stackexchange-clustering-p2p
1385
+ name: MTEB StackExchangeClusteringP2P
1386
+ config: default
1387
+ split: test
1388
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
1389
+ metrics:
1390
+ - type: v_measure
1391
+ value: 31.606006030205545
1392
+ - task:
1393
+ type: Reranking
1394
+ dataset:
1395
+ type: mteb/stackoverflowdupquestions-reranking
1396
+ name: MTEB StackOverflowDupQuestions
1397
+ config: default
1398
+ split: test
1399
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
1400
+ metrics:
1401
+ - type: map
1402
+ value: 50.87384988372756
1403
+ - type: mrr
1404
+ value: 51.62476922587217
1405
+ - task:
1406
+ type: Summarization
1407
+ dataset:
1408
+ type: mteb/summeval
1409
+ name: MTEB SummEval
1410
+ config: default
1411
+ split: test
1412
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
1413
+ metrics:
1414
+ - type: cos_sim_pearson
1415
+ value: 30.355859978837156
1416
+ - type: cos_sim_spearman
1417
+ value: 30.0847548337847
1418
+ - type: dot_pearson
1419
+ value: 19.391736817587557
1420
+ - type: dot_spearman
1421
+ value: 20.732256259543014
1422
+ - task:
1423
+ type: Retrieval
1424
+ dataset:
1425
+ type: trec-covid
1426
+ name: MTEB TRECCOVID
1427
+ config: default
1428
+ split: test
1429
+ revision: None
1430
+ metrics:
1431
+ - type: map_at_1
1432
+ value: 0.19
1433
+ - type: map_at_10
1434
+ value: 1.2850000000000001
1435
+ - type: map_at_100
1436
+ value: 6.376999999999999
1437
+ - type: map_at_1000
1438
+ value: 15.21
1439
+ - type: map_at_3
1440
+ value: 0.492
1441
+ - type: map_at_5
1442
+ value: 0.776
1443
+ - type: mrr_at_1
1444
+ value: 68.0
1445
+ - type: mrr_at_10
1446
+ value: 79.783
1447
+ - type: mrr_at_100
1448
+ value: 79.783
1449
+ - type: mrr_at_1000
1450
+ value: 79.783
1451
+ - type: mrr_at_3
1452
+ value: 77.333
1453
+ - type: mrr_at_5
1454
+ value: 79.533
1455
+ - type: ndcg_at_1
1456
+ value: 62.0
1457
+ - type: ndcg_at_10
1458
+ value: 54.635
1459
+ - type: ndcg_at_100
1460
+ value: 40.939
1461
+ - type: ndcg_at_1000
1462
+ value: 37.716
1463
+ - type: ndcg_at_3
1464
+ value: 58.531
1465
+ - type: ndcg_at_5
1466
+ value: 58.762
1467
+ - type: precision_at_1
1468
+ value: 68.0
1469
+ - type: precision_at_10
1470
+ value: 58.8
1471
+ - type: precision_at_100
1472
+ value: 41.74
1473
+ - type: precision_at_1000
1474
+ value: 16.938
1475
+ - type: precision_at_3
1476
+ value: 64.0
1477
+ - type: precision_at_5
1478
+ value: 64.8
1479
+ - type: recall_at_1
1480
+ value: 0.19
1481
+ - type: recall_at_10
1482
+ value: 1.547
1483
+ - type: recall_at_100
1484
+ value: 9.739
1485
+ - type: recall_at_1000
1486
+ value: 35.815000000000005
1487
+ - type: recall_at_3
1488
+ value: 0.528
1489
+ - type: recall_at_5
1490
+ value: 0.894
1491
+ - task:
1492
+ type: Retrieval
1493
+ dataset:
1494
+ type: webis-touche2020
1495
+ name: MTEB Touche2020
1496
+ config: default
1497
+ split: test
1498
+ revision: None
1499
+ metrics:
1500
+ - type: map_at_1
1501
+ value: 1.514
1502
+ - type: map_at_10
1503
+ value: 7.163
1504
+ - type: map_at_100
1505
+ value: 11.623999999999999
1506
+ - type: map_at_1000
1507
+ value: 13.062999999999999
1508
+ - type: map_at_3
1509
+ value: 3.51
1510
+ - type: map_at_5
1511
+ value: 4.661
1512
+ - type: mrr_at_1
1513
+ value: 20.408
1514
+ - type: mrr_at_10
1515
+ value: 33.993
1516
+ - type: mrr_at_100
1517
+ value: 35.257
1518
+ - type: mrr_at_1000
1519
+ value: 35.313
1520
+ - type: mrr_at_3
1521
+ value: 30.272
1522
+ - type: mrr_at_5
1523
+ value: 31.701
1524
+ - type: ndcg_at_1
1525
+ value: 18.367
1526
+ - type: ndcg_at_10
1527
+ value: 18.062
1528
+ - type: ndcg_at_100
1529
+ value: 28.441
1530
+ - type: ndcg_at_1000
1531
+ value: 40.748
1532
+ - type: ndcg_at_3
1533
+ value: 18.651999999999997
1534
+ - type: ndcg_at_5
1535
+ value: 17.055
1536
+ - type: precision_at_1
1537
+ value: 20.408
1538
+ - type: precision_at_10
1539
+ value: 17.551
1540
+ - type: precision_at_100
1541
+ value: 6.223999999999999
1542
+ - type: precision_at_1000
1543
+ value: 1.427
1544
+ - type: precision_at_3
1545
+ value: 20.408
1546
+ - type: precision_at_5
1547
+ value: 17.959
1548
+ - type: recall_at_1
1549
+ value: 1.514
1550
+ - type: recall_at_10
1551
+ value: 13.447000000000001
1552
+ - type: recall_at_100
1553
+ value: 39.77
1554
+ - type: recall_at_1000
1555
+ value: 76.95
1556
+ - type: recall_at_3
1557
+ value: 4.806
1558
+ - type: recall_at_5
1559
+ value: 6.873
1560
+ - task:
1561
+ type: Classification
1562
+ dataset:
1563
+ type: mteb/toxic_conversations_50k
1564
+ name: MTEB ToxicConversationsClassification
1565
+ config: default
1566
+ split: test
1567
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
1568
+ metrics:
1569
+ - type: accuracy
1570
+ value: 65.53179999999999
1571
+ - type: ap
1572
+ value: 11.504743595308318
1573
+ - type: f1
1574
+ value: 49.74264614001562
1575
+ - task:
1576
+ type: Classification
1577
+ dataset:
1578
+ type: mteb/tweet_sentiment_extraction
1579
+ name: MTEB TweetSentimentExtractionClassification
1580
+ config: default
1581
+ split: test
1582
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
1583
+ metrics:
1584
+ - type: accuracy
1585
+ value: 56.47425014148275
1586
+ - type: f1
1587
+ value: 56.555750746223346
1588
+ - task:
1589
+ type: Clustering
1590
+ dataset:
1591
+ type: mteb/twentynewsgroups-clustering
1592
+ name: MTEB TwentyNewsgroupsClustering
1593
+ config: default
1594
+ split: test
1595
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
1596
+ metrics:
1597
+ - type: v_measure
1598
+ value: 39.27004599453324
1599
+ - task:
1600
+ type: PairClassification
1601
+ dataset:
1602
+ type: mteb/twittersemeval2015-pairclassification
1603
+ name: MTEB TwitterSemEval2015
1604
+ config: default
1605
+ split: test
1606
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
1607
+ metrics:
1608
+ - type: cos_sim_accuracy
1609
+ value: 84.47875067056088
1610
+ - type: cos_sim_ap
1611
+ value: 68.630858164926
1612
+ - type: cos_sim_f1
1613
+ value: 64.5112402121748
1614
+ - type: cos_sim_precision
1615
+ value: 61.87015503875969
1616
+ - type: cos_sim_recall
1617
+ value: 67.38786279683377
1618
+ - type: dot_accuracy
1619
+ value: 77.68969422423557
1620
+ - type: dot_ap
1621
+ value: 37.28838556128439
1622
+ - type: dot_f1
1623
+ value: 43.27918525376652
1624
+ - type: dot_precision
1625
+ value: 31.776047460140898
1626
+ - type: dot_recall
1627
+ value: 67.83641160949868
1628
+ - type: euclidean_accuracy
1629
+ value: 82.67866722298385
1630
+ - type: euclidean_ap
1631
+ value: 62.72011158877603
1632
+ - type: euclidean_f1
1633
+ value: 60.39579770339605
1634
+ - type: euclidean_precision
1635
+ value: 56.23293903548681
1636
+ - type: euclidean_recall
1637
+ value: 65.22427440633246
1638
+ - type: manhattan_accuracy
1639
+ value: 82.67866722298385
1640
+ - type: manhattan_ap
1641
+ value: 62.80364769571995
1642
+ - type: manhattan_f1
1643
+ value: 60.413827282864574
1644
+ - type: manhattan_precision
1645
+ value: 56.94931090866619
1646
+ - type: manhattan_recall
1647
+ value: 64.32717678100263
1648
+ - type: max_accuracy
1649
+ value: 84.47875067056088
1650
+ - type: max_ap
1651
+ value: 68.630858164926
1652
+ - type: max_f1
1653
+ value: 64.5112402121748
1654
+ - task:
1655
+ type: PairClassification
1656
+ dataset:
1657
+ type: mteb/twitterurlcorpus-pairclassification
1658
+ name: MTEB TwitterURLCorpus
1659
+ config: default
1660
+ split: test
1661
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
1662
+ metrics:
1663
+ - type: cos_sim_accuracy
1664
+ value: 88.4192959987581
1665
+ - type: cos_sim_ap
1666
+ value: 84.81803796578367
1667
+ - type: cos_sim_f1
1668
+ value: 77.1643709825528
1669
+ - type: cos_sim_precision
1670
+ value: 73.77958839643183
1671
+ - type: cos_sim_recall
1672
+ value: 80.874653526332
1673
+ - type: dot_accuracy
1674
+ value: 81.99441145651414
1675
+ - type: dot_ap
1676
+ value: 67.908510950511
1677
+ - type: dot_f1
1678
+ value: 64.4734255193656
1679
+ - type: dot_precision
1680
+ value: 56.120935539075866
1681
+ - type: dot_recall
1682
+ value: 75.74684323991376
1683
+ - type: euclidean_accuracy
1684
+ value: 82.67163426087632
1685
+ - type: euclidean_ap
1686
+ value: 70.1466353903414
1687
+ - type: euclidean_f1
1688
+ value: 62.686024087617795
1689
+ - type: euclidean_precision
1690
+ value: 59.42738875474301
1691
+ - type: euclidean_recall
1692
+ value: 66.32275947028026
1693
+ - type: manhattan_accuracy
1694
+ value: 82.6483486630186
1695
+ - type: manhattan_ap
1696
+ value: 70.12958345267741
1697
+ - type: manhattan_f1
1698
+ value: 62.5966218150587
1699
+ - type: manhattan_precision
1700
+ value: 58.47820272800214
1701
+ - type: manhattan_recall
1702
+ value: 67.33908222975053
1703
+ - type: max_accuracy
1704
+ value: 88.4192959987581
1705
+ - type: max_ap
1706
+ value: 84.81803796578367
1707
+ - type: max_f1
1708
+ value: 77.1643709825528
1709
+ ---
1710
  ---
1711
 
1712
  <br><br>
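
The block added by this commit is standard Hugging Face model-index front matter: one entry per MTEB benchmark, each carrying a `task`, a `dataset`, and a list of `metrics` given as `type`/`value` pairs. As a minimal, illustrative sketch (not part of the commit itself), the snippet below shows one way to parse that front matter from a local copy of the updated README.md and print the reported scores per dataset; the file path, the `load_front_matter` helper name, and the use of PyYAML are assumptions, not anything shipped with the model card.

```python
# Illustrative sketch only (not part of this commit): reading the model-index
# front matter added above. Assumes the updated README.md is saved locally and
# that PyYAML is installed; load_front_matter is our own helper name.
import yaml


def load_front_matter(path: str) -> dict:
    """Parse the YAML block between the leading '---' fences of a model card."""
    with open(path, encoding="utf-8") as f:
        text = f.read()
    # The front matter is everything between the first two '---' fences.
    _, front_matter, _ = text.split("---", 2)
    return yaml.safe_load(front_matter)


meta = load_front_matter("README.md")
for model in meta["model-index"]:
    print(model["name"])
    for result in model["results"]:
        task_type = result["task"]["type"]        # e.g. "Classification"
        dataset_name = result["dataset"]["name"]  # e.g. "MTEB ArguAna"
        metrics = {m["type"]: m["value"] for m in result["metrics"]}
        print(f"  {task_type:15s} {dataset_name}: {metrics}")
```

Run against the README in this commit, this prints one line per benchmark with the scores reported above, for example the accuracy/ap/f1 triple for MTEB AmazonCounterfactualClassification (en).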