ClayAtlas committed
Commit fb4e159 (1 parent: f7806f8)

Update README.md

Files changed (1):
  1. README.md +1031 -0
README.md CHANGED
@@ -1,3 +1,1034 @@
+ ---
+ tags:
+ - mteb
+ model-index:
+ - name: winberta-large
+   results:
+   - task:
+       type: STS
+     dataset:
+       type: C-MTEB/AFQMC
+       name: MTEB AFQMC
+       config: default
+       split: validation
+       revision: None
+     metrics:
+     - type: cos_sim_pearson
+       value: 41.77725389846057
+     - type: cos_sim_spearman
+       value: 46.70255351226939
+     - type: euclidean_pearson
+       value: 45.22550045993912
+     - type: euclidean_spearman
+       value: 46.70255351226939
+     - type: manhattan_pearson
+       value: 45.19405644988887
+     - type: manhattan_spearman
+       value: 46.680519207418264
+   - task:
+       type: STS
+     dataset:
+       type: C-MTEB/ATEC
+       name: MTEB ATEC
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: cos_sim_pearson
+       value: 41.90208621690777
+     - type: cos_sim_spearman
+       value: 49.95255202729448
+     - type: euclidean_pearson
+       value: 49.756907552767956
+     - type: euclidean_spearman
+       value: 49.95255202729448
+     - type: manhattan_pearson
+       value: 49.75325413164269
+     - type: manhattan_spearman
+       value: 49.96252496785108
+   - task:
+       type: Classification
+     dataset:
+       type: mteb/amazon_reviews_multi
+       name: MTEB AmazonReviewsClassification (zh)
+       config: zh
+       split: test
+       revision: 1399c76144fd37290681b995c656ef9b2e06e26d
+     metrics:
+     - type: accuracy
+       value: 42.038000000000004
+     - type: f1
+       value: 40.20953065985179
+   - task:
+       type: STS
+     dataset:
+       type: C-MTEB/BQ
+       name: MTEB BQ
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: cos_sim_pearson
+       value: 54.24089984585099
+     - type: cos_sim_spearman
+       value: 56.075463873104766
+     - type: euclidean_pearson
+       value: 55.20252472986401
+     - type: euclidean_spearman
+       value: 56.075463873104766
+     - type: manhattan_pearson
+       value: 55.13086772848814
+     - type: manhattan_spearman
+       value: 56.02039158535162
+   - task:
+       type: Clustering
+     dataset:
+       type: C-MTEB/CLSClusteringP2P
+       name: MTEB CLSClusteringP2P
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: v_measure
+       value: 42.83769092800803
+   - task:
+       type: Clustering
+     dataset:
+       type: C-MTEB/CLSClusteringS2S
+       name: MTEB CLSClusteringS2S
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: v_measure
+       value: 39.772368416311195
+   - task:
+       type: Reranking
+     dataset:
+       type: C-MTEB/CMedQAv1-reranking
+       name: MTEB CMedQAv1
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map
+       value: 78.3895639270477
+     - type: mrr
+       value: 81.64801587301588
+   - task:
+       type: Reranking
+     dataset:
+       type: C-MTEB/CMedQAv2-reranking
+       name: MTEB CMedQAv2
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: map
+       value: 80.84221923370502
+     - type: mrr
+       value: 84.32821428571428
+   - task:
+       type: Retrieval
+     dataset:
+       type: C-MTEB/CmedqaRetrieval
+       name: MTEB CmedqaRetrieval
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 18.695999999999998
+     - type: map_at_10
+       value: 28.171000000000003
+     - type: map_at_100
+       value: 29.927
+     - type: map_at_1000
+       value: 30.09
+     - type: map_at_3
+       value: 24.854000000000003
+     - type: map_at_5
+       value: 26.573
+     - type: mrr_at_1
+       value: 29.256999999999998
+     - type: mrr_at_10
+       value: 36.584
+     - type: mrr_at_100
+       value: 37.643
+     - type: mrr_at_1000
+       value: 37.713
+     - type: mrr_at_3
+       value: 34.171
+     - type: mrr_at_5
+       value: 35.436
+     - type: ndcg_at_1
+       value: 29.256999999999998
+     - type: ndcg_at_10
+       value: 34.079
+     - type: ndcg_at_100
+       value: 41.538000000000004
+     - type: ndcg_at_1000
+       value: 44.651999999999994
+     - type: ndcg_at_3
+       value: 29.439999999999998
+     - type: ndcg_at_5
+       value: 31.172
+     - type: precision_at_1
+       value: 29.256999999999998
+     - type: precision_at_10
+       value: 7.804
+     - type: precision_at_100
+       value: 1.392
+     - type: precision_at_1000
+       value: 0.179
+     - type: precision_at_3
+       value: 16.804
+     - type: precision_at_5
+       value: 12.267999999999999
+     - type: recall_at_1
+       value: 18.695999999999998
+     - type: recall_at_10
+       value: 43.325
+     - type: recall_at_100
+       value: 74.765
+     - type: recall_at_1000
+       value: 95.999
+     - type: recall_at_3
+       value: 29.384
+     - type: recall_at_5
+       value: 34.765
+   - task:
+       type: PairClassification
+     dataset:
+       type: C-MTEB/CMNLI
+       name: MTEB Cmnli
+       config: default
+       split: validation
+       revision: None
+     metrics:
+     - type: cos_sim_accuracy
+       value: 79.15814792543597
+     - type: cos_sim_ap
+       value: 87.29838623651833
+     - type: cos_sim_f1
+       value: 80.6512349097353
+     - type: cos_sim_precision
+       value: 76.62037037037037
+     - type: cos_sim_recall
+       value: 85.1297638531681
+     - type: dot_accuracy
+       value: 79.15814792543597
+     - type: dot_ap
+       value: 87.30641807786448
+     - type: dot_f1
+       value: 80.6512349097353
+     - type: dot_precision
+       value: 76.62037037037037
+     - type: dot_recall
+       value: 85.1297638531681
+     - type: euclidean_accuracy
+       value: 79.15814792543597
+     - type: euclidean_ap
+       value: 87.29838623651833
+     - type: euclidean_f1
+       value: 80.6512349097353
+     - type: euclidean_precision
+       value: 76.62037037037037
+     - type: euclidean_recall
+       value: 85.1297638531681
+     - type: manhattan_accuracy
+       value: 79.15814792543597
+     - type: manhattan_ap
+       value: 87.29705330875109
+     - type: manhattan_f1
+       value: 80.66914498141264
+     - type: manhattan_precision
+       value: 75.76504415691106
+     - type: manhattan_recall
+       value: 86.2520458265139
+     - type: max_accuracy
+       value: 79.15814792543597
+     - type: max_ap
+       value: 87.30641807786448
+     - type: max_f1
+       value: 80.66914498141264
+   - task:
+       type: Retrieval
+     dataset:
+       type: C-MTEB/CovidRetrieval
+       name: MTEB CovidRetrieval
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 58.325
+     - type: map_at_10
+       value: 67.572
+     - type: map_at_100
+       value: 68.142
+     - type: map_at_1000
+       value: 68.152
+     - type: map_at_3
+       value: 65.446
+     - type: map_at_5
+       value: 66.794
+     - type: mrr_at_1
+       value: 58.272
+     - type: mrr_at_10
+       value: 67.469
+     - type: mrr_at_100
+       value: 68.048
+     - type: mrr_at_1000
+       value: 68.05799999999999
+     - type: mrr_at_3
+       value: 65.385
+     - type: mrr_at_5
+       value: 66.728
+     - type: ndcg_at_1
+       value: 58.377
+     - type: ndcg_at_10
+       value: 71.922
+     - type: ndcg_at_100
+       value: 74.49799999999999
+     - type: ndcg_at_1000
+       value: 74.80799999999999
+     - type: ndcg_at_3
+       value: 67.711
+     - type: ndcg_at_5
+       value: 70.075
+     - type: precision_at_1
+       value: 58.377
+     - type: precision_at_10
+       value: 8.641
+     - type: precision_at_100
+       value: 0.9809999999999999
+     - type: precision_at_1000
+       value: 0.101
+     - type: precision_at_3
+       value: 24.833
+     - type: precision_at_5
+       value: 16.101
+     - type: recall_at_1
+       value: 58.325
+     - type: recall_at_10
+       value: 85.458
+     - type: recall_at_100
+       value: 97.05
+     - type: recall_at_1000
+       value: 99.579
+     - type: recall_at_3
+       value: 74.18299999999999
+     - type: recall_at_5
+       value: 79.768
+   - task:
+       type: Retrieval
+     dataset:
+       type: C-MTEB/DuRetrieval
+       name: MTEB DuRetrieval
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 23.448
+     - type: map_at_10
+       value: 70.368
+     - type: map_at_100
+       value: 73.644
+     - type: map_at_1000
+       value: 73.727
+     - type: map_at_3
+       value: 48.317
+     - type: map_at_5
+       value: 61.114999999999995
+     - type: mrr_at_1
+       value: 83.5
+     - type: mrr_at_10
+       value: 88.592
+     - type: mrr_at_100
+       value: 88.69200000000001
+     - type: mrr_at_1000
+       value: 88.696
+     - type: mrr_at_3
+       value: 88.058
+     - type: mrr_at_5
+       value: 88.458
+     - type: ndcg_at_1
+       value: 83.5
+     - type: ndcg_at_10
+       value: 79.696
+     - type: ndcg_at_100
+       value: 83.88799999999999
+     - type: ndcg_at_1000
+       value: 84.64699999999999
+     - type: ndcg_at_3
+       value: 78.39500000000001
+     - type: ndcg_at_5
+       value: 77.289
+     - type: precision_at_1
+       value: 83.5
+     - type: precision_at_10
+       value: 38.525
+     - type: precision_at_100
+       value: 4.656
+     - type: precision_at_1000
+       value: 0.485
+     - type: precision_at_3
+       value: 70.383
+     - type: precision_at_5
+       value: 59.56
+     - type: recall_at_1
+       value: 23.448
+     - type: recall_at_10
+       value: 81.274
+     - type: recall_at_100
+       value: 94.447
+     - type: recall_at_1000
+       value: 98.209
+     - type: recall_at_3
+       value: 51.122
+     - type: recall_at_5
+       value: 67.29899999999999
+   - task:
+       type: Retrieval
+     dataset:
+       type: C-MTEB/EcomRetrieval
+       name: MTEB EcomRetrieval
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 44.2
+     - type: map_at_10
+       value: 54.083999999999996
+     - type: map_at_100
+       value: 54.775
+     - type: map_at_1000
+       value: 54.800000000000004
+     - type: map_at_3
+       value: 51.5
+     - type: map_at_5
+       value: 52.94
+     - type: mrr_at_1
+       value: 44.2
+     - type: mrr_at_10
+       value: 54.083999999999996
+     - type: mrr_at_100
+       value: 54.775
+     - type: mrr_at_1000
+       value: 54.800000000000004
+     - type: mrr_at_3
+       value: 51.5
+     - type: mrr_at_5
+       value: 52.94
+     - type: ndcg_at_1
+       value: 44.2
+     - type: ndcg_at_10
+       value: 59.221999999999994
+     - type: ndcg_at_100
+       value: 62.463
+     - type: ndcg_at_1000
+       value: 63.159
+     - type: ndcg_at_3
+       value: 53.888000000000005
+     - type: ndcg_at_5
+       value: 56.483000000000004
+     - type: precision_at_1
+       value: 44.2
+     - type: precision_at_10
+       value: 7.55
+     - type: precision_at_100
+       value: 0.9039999999999999
+     - type: precision_at_1000
+       value: 0.096
+     - type: precision_at_3
+       value: 20.267
+     - type: precision_at_5
+       value: 13.420000000000002
+     - type: recall_at_1
+       value: 44.2
+     - type: recall_at_10
+       value: 75.5
+     - type: recall_at_100
+       value: 90.4
+     - type: recall_at_1000
+       value: 95.89999999999999
+     - type: recall_at_3
+       value: 60.8
+     - type: recall_at_5
+       value: 67.10000000000001
+   - task:
+       type: Classification
+     dataset:
+       type: C-MTEB/IFlyTek-classification
+       name: MTEB IFlyTek
+       config: default
+       split: validation
+       revision: None
+     metrics:
+     - type: accuracy
+       value: 46.30242400923432
+     - type: f1
+       value: 34.9084495621858
+   - task:
+       type: Classification
+     dataset:
+       type: C-MTEB/JDReview-classification
+       name: MTEB JDReview
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: accuracy
+       value: 77.2983114446529
+     - type: ap
+       value: 38.88426285856333
+     - type: f1
+       value: 70.55729261942591
+   - task:
+       type: STS
+     dataset:
+       type: C-MTEB/LCQMC
+       name: MTEB LCQMC
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: cos_sim_pearson
+       value: 68.5643564120875
+     - type: cos_sim_spearman
+       value: 74.96268256412532
+     - type: euclidean_pearson
+       value: 74.05621406127399
+     - type: euclidean_spearman
+       value: 74.96268256412532
+     - type: manhattan_pearson
+       value: 74.04916252136826
+     - type: manhattan_spearman
+       value: 74.95628866390487
+   - task:
+       type: Reranking
+     dataset:
+       type: C-MTEB/Mmarco-reranking
+       name: MTEB MMarcoReranking
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map
+       value: 27.289171935571773
+     - type: mrr
+       value: 25.7218253968254
+   - task:
+       type: Retrieval
+     dataset:
+       type: C-MTEB/MMarcoRetrieval
+       name: MTEB MMarcoRetrieval
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 61.632
+     - type: map_at_10
+       value: 70.796
+     - type: map_at_100
+       value: 71.21300000000001
+     - type: map_at_1000
+       value: 71.22800000000001
+     - type: map_at_3
+       value: 68.848
+     - type: map_at_5
+       value: 70.044
+     - type: mrr_at_1
+       value: 63.768
+     - type: mrr_at_10
+       value: 71.516
+     - type: mrr_at_100
+       value: 71.884
+     - type: mrr_at_1000
+       value: 71.897
+     - type: mrr_at_3
+       value: 69.814
+     - type: mrr_at_5
+       value: 70.843
+     - type: ndcg_at_1
+       value: 63.768
+     - type: ndcg_at_10
+       value: 74.727
+     - type: ndcg_at_100
+       value: 76.649
+     - type: ndcg_at_1000
+       value: 77.05300000000001
+     - type: ndcg_at_3
+       value: 71.00800000000001
+     - type: ndcg_at_5
+       value: 73.015
+     - type: precision_at_1
+       value: 63.768
+     - type: precision_at_10
+       value: 9.15
+     - type: precision_at_100
+       value: 1.012
+     - type: precision_at_1000
+       value: 0.105
+     - type: precision_at_3
+       value: 26.848
+     - type: precision_at_5
+       value: 17.172
+     - type: recall_at_1
+       value: 61.632
+     - type: recall_at_10
+       value: 86.162
+     - type: recall_at_100
+       value: 94.953
+     - type: recall_at_1000
+       value: 98.148
+     - type: recall_at_3
+       value: 76.287
+     - type: recall_at_5
+       value: 81.03399999999999
+   - task:
+       type: Retrieval
+     dataset:
+       type: C-MTEB/MedicalRetrieval
+       name: MTEB MedicalRetrieval
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 43.2
+     - type: map_at_10
+       value: 48.788
+     - type: map_at_100
+       value: 49.412
+     - type: map_at_1000
+       value: 49.480000000000004
+     - type: map_at_3
+       value: 47.55
+     - type: map_at_5
+       value: 48.27
+     - type: mrr_at_1
+       value: 43.2
+     - type: mrr_at_10
+       value: 48.788
+     - type: mrr_at_100
+       value: 49.412
+     - type: mrr_at_1000
+       value: 49.480000000000004
+     - type: mrr_at_3
+       value: 47.55
+     - type: mrr_at_5
+       value: 48.27
+     - type: ndcg_at_1
+       value: 43.2
+     - type: ndcg_at_10
+       value: 51.504000000000005
+     - type: ndcg_at_100
+       value: 54.718
+     - type: ndcg_at_1000
+       value: 56.754000000000005
+     - type: ndcg_at_3
+       value: 48.975
+     - type: ndcg_at_5
+       value: 50.283
+     - type: precision_at_1
+       value: 43.2
+     - type: precision_at_10
+       value: 6.0
+     - type: precision_at_100
+       value: 0.755
+     - type: precision_at_1000
+       value: 0.092
+     - type: precision_at_3
+       value: 17.7
+     - type: precision_at_5
+       value: 11.26
+     - type: recall_at_1
+       value: 43.2
+     - type: recall_at_10
+       value: 60.0
+     - type: recall_at_100
+       value: 75.5
+     - type: recall_at_1000
+       value: 92.0
+     - type: recall_at_3
+       value: 53.1
+     - type: recall_at_5
+       value: 56.3
+   - task:
+       type: Classification
+     dataset:
+       type: C-MTEB/MultilingualSentiment-classification
+       name: MTEB MultilingualSentiment
+       config: default
+       split: validation
+       revision: None
+     metrics:
+     - type: accuracy
+       value: 71.66666666666669
+     - type: f1
+       value: 71.30679309756734
+   - task:
+       type: PairClassification
+     dataset:
+       type: C-MTEB/OCNLI
+       name: MTEB Ocnli
+       config: default
+       split: validation
+       revision: None
+     metrics:
+     - type: cos_sim_accuracy
+       value: 73.47049269085004
+     - type: cos_sim_ap
+       value: 77.45627413542758
+     - type: cos_sim_f1
+       value: 76.38326585695006
+     - type: cos_sim_precision
+       value: 66.53605015673982
+     - type: cos_sim_recall
+       value: 89.65153115100317
+     - type: dot_accuracy
+       value: 73.47049269085004
+     - type: dot_ap
+       value: 77.45627413542758
+     - type: dot_f1
+       value: 76.38326585695006
+     - type: dot_precision
+       value: 66.53605015673982
+     - type: dot_recall
+       value: 89.65153115100317
+     - type: euclidean_accuracy
+       value: 73.47049269085004
+     - type: euclidean_ap
+       value: 77.45620654340667
+     - type: euclidean_f1
+       value: 76.38326585695006
+     - type: euclidean_precision
+       value: 66.53605015673982
+     - type: euclidean_recall
+       value: 89.65153115100317
+     - type: manhattan_accuracy
+       value: 73.36220898754738
+     - type: manhattan_ap
+       value: 77.37536169412738
+     - type: manhattan_f1
+       value: 76.38640429338103
+     - type: manhattan_precision
+       value: 66.25290923196276
+     - type: manhattan_recall
+       value: 90.17951425554382
+     - type: max_accuracy
+       value: 73.47049269085004
+     - type: max_ap
+       value: 77.45627413542758
+     - type: max_f1
+       value: 76.38640429338103
+   - task:
+       type: Classification
+     dataset:
+       type: C-MTEB/OnlineShopping-classification
+       name: MTEB OnlineShopping
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: accuracy
+       value: 91.53
+     - type: ap
+       value: 89.42581459526625
+     - type: f1
+       value: 91.52129393166419
+   - task:
+       type: STS
+     dataset:
+       type: C-MTEB/PAWSX
+       name: MTEB PAWSX
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: cos_sim_pearson
+       value: 29.140746638094146
+     - type: cos_sim_spearman
+       value: 33.485405306894954
+     - type: euclidean_pearson
+       value: 33.519345307695055
+     - type: euclidean_spearman
+       value: 33.485405306894954
+     - type: manhattan_pearson
+       value: 33.477525315080555
+     - type: manhattan_spearman
+       value: 33.45108970796106
+   - task:
+       type: STS
+     dataset:
+       type: C-MTEB/QBQTC
+       name: MTEB QBQTC
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: cos_sim_pearson
+       value: 29.1489803117667
+     - type: cos_sim_spearman
+       value: 31.064278185902484
+     - type: euclidean_pearson
+       value: 29.46668604738617
+     - type: euclidean_spearman
+       value: 31.064327209275294
+     - type: manhattan_pearson
+       value: 29.486028367555363
+     - type: manhattan_spearman
+       value: 31.08380235579532
+   - task:
+       type: STS
+     dataset:
+       type: mteb/sts22-crosslingual-sts
+       name: MTEB STS22 (zh)
+       config: zh
+       split: test
+       revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+     metrics:
+     - type: cos_sim_pearson
+       value: 62.11437173048506
+     - type: cos_sim_spearman
+       value: 64.51063977663124
+     - type: euclidean_pearson
+       value: 63.21313519423639
+     - type: euclidean_spearman
+       value: 64.51063977663124
+     - type: manhattan_pearson
+       value: 66.21953089701206
+     - type: manhattan_spearman
+       value: 66.39662588897919
+   - task:
+       type: STS
+     dataset:
+       type: C-MTEB/STSB
+       name: MTEB STSB
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: cos_sim_pearson
+       value: 78.98157503278959
+     - type: cos_sim_spearman
+       value: 79.62582795918624
+     - type: euclidean_pearson
+       value: 79.44521376122044
+     - type: euclidean_spearman
+       value: 79.62582795918624
+     - type: manhattan_pearson
+       value: 79.4254734731864
+     - type: manhattan_spearman
+       value: 79.61078135348473
+   - task:
+       type: Reranking
+     dataset:
+       type: C-MTEB/T2Reranking
+       name: MTEB T2Reranking
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map
+       value: 66.29923663749156
+     - type: mrr
+       value: 76.31176720293172
+   - task:
+       type: Retrieval
+     dataset:
+       type: C-MTEB/T2Retrieval
+       name: MTEB T2Retrieval
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 24.518
+     - type: map_at_10
+       value: 67.938
+     - type: map_at_100
+       value: 71.769
+     - type: map_at_1000
+       value: 71.882
+     - type: map_at_3
+       value: 47.884
+     - type: map_at_5
+       value: 58.733000000000004
+     - type: mrr_at_1
+       value: 84.328
+     - type: mrr_at_10
+       value: 87.96000000000001
+     - type: mrr_at_100
+       value: 88.114
+     - type: mrr_at_1000
+       value: 88.12
+     - type: mrr_at_3
+       value: 87.306
+     - type: mrr_at_5
+       value: 87.734
+     - type: ndcg_at_1
+       value: 84.328
+     - type: ndcg_at_10
+       value: 77.077
+     - type: ndcg_at_100
+       value: 81.839
+     - type: ndcg_at_1000
+       value: 82.974
+     - type: ndcg_at_3
+       value: 79.209
+     - type: ndcg_at_5
+       value: 77.345
+     - type: precision_at_1
+       value: 84.328
+     - type: precision_at_10
+       value: 38.596000000000004
+     - type: precision_at_100
+       value: 4.825
+     - type: precision_at_1000
+       value: 0.51
+     - type: precision_at_3
+       value: 69.547
+     - type: precision_at_5
+       value: 58.033
+     - type: recall_at_1
+       value: 24.518
+     - type: recall_at_10
+       value: 75.982
+     - type: recall_at_100
+       value: 91.40899999999999
+     - type: recall_at_1000
+       value: 97.129
+     - type: recall_at_3
+       value: 50.014
+     - type: recall_at_5
+       value: 62.971
+   - task:
+       type: Classification
+     dataset:
+       type: C-MTEB/TNews-classification
+       name: MTEB TNews
+       config: default
+       split: validation
+       revision: None
+     metrics:
+     - type: accuracy
+       value: 50.17400000000001
+     - type: f1
+       value: 48.49778139007515
+   - task:
+       type: Clustering
+     dataset:
+       type: C-MTEB/ThuNewsClusteringP2P
+       name: MTEB ThuNewsClusteringP2P
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: v_measure
+       value: 58.925265567508944
+   - task:
+       type: Clustering
+     dataset:
+       type: C-MTEB/ThuNewsClusteringS2S
+       name: MTEB ThuNewsClusteringS2S
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: v_measure
+       value: 53.70728044857883
+   - task:
+       type: Retrieval
+     dataset:
+       type: C-MTEB/VideoRetrieval
+       name: MTEB VideoRetrieval
+       config: default
+       split: dev
+       revision: None
+     metrics:
+     - type: map_at_1
+       value: 49.5
+     - type: map_at_10
+       value: 59.772000000000006
+     - type: map_at_100
+       value: 60.312
+     - type: map_at_1000
+       value: 60.333000000000006
+     - type: map_at_3
+       value: 57.367000000000004
+     - type: map_at_5
+       value: 58.797
+     - type: mrr_at_1
+       value: 49.5
+     - type: mrr_at_10
+       value: 59.772000000000006
+     - type: mrr_at_100
+       value: 60.312
+     - type: mrr_at_1000
+       value: 60.333000000000006
+     - type: mrr_at_3
+       value: 57.367000000000004
+     - type: mrr_at_5
+       value: 58.797
+     - type: ndcg_at_1
+       value: 49.5
+     - type: ndcg_at_10
+       value: 64.672
+     - type: ndcg_at_100
+       value: 67.389
+     - type: ndcg_at_1000
+       value: 67.984
+     - type: ndcg_at_3
+       value: 59.8
+     - type: ndcg_at_5
+       value: 62.385999999999996
+     - type: precision_at_1
+       value: 49.5
+     - type: precision_at_10
+       value: 8.0
+     - type: precision_at_100
+       value: 0.9289999999999999
+     - type: precision_at_1000
+       value: 0.098
+     - type: precision_at_3
+       value: 22.267
+     - type: precision_at_5
+       value: 14.62
+     - type: recall_at_1
+       value: 49.5
+     - type: recall_at_10
+       value: 80.0
+     - type: recall_at_100
+       value: 92.9
+     - type: recall_at_1000
+       value: 97.7
+     - type: recall_at_3
+       value: 66.8
+     - type: recall_at_5
+       value: 73.1
+   - task:
+       type: Classification
+     dataset:
+       type: C-MTEB/waimai-classification
+       name: MTEB Waimai
+       config: default
+       split: test
+       revision: None
+     metrics:
+     - type: accuracy
+       value: 85.97999999999999
+     - type: ap
+       value: 68.63874013611306
+     - type: f1
+       value: 84.22025909308913
+ ---
+
  ---
  license: apache-2.0
  ---