thenlper committed on
Commit 3beb4a4
1 Parent(s): 68a7fd2

Update README.md

Files changed (1)
  1. README.md +3316 -3
README.md CHANGED
@@ -1,3 +1,3316 @@
- ---
- license: apache-2.0
- ---
1
+ ---
2
+ tags:
3
+ - mteb
4
+ - sentence-transformers
5
+ - transformers
6
+ - Qwen2
7
+ - sentence-similarity
8
+ license: apache-2.0
9
+ model-index:
10
+ - name: gte-qwen2-7B-instruct
11
+ results:
12
+ - task:
13
+ type: Classification
14
+ dataset:
15
+ type: mteb/amazon_counterfactual
16
+ name: MTEB AmazonCounterfactualClassification (en)
17
+ config: en
18
+ split: test
19
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
20
+ metrics:
21
+ - type: accuracy
22
+ value: 88.01492537313432
23
+ - type: ap
24
+ value: 59.096217055359276
25
+ - type: f1
26
+ value: 83.2699173062069
27
+ - task:
28
+ type: Classification
29
+ dataset:
30
+ type: mteb/amazon_polarity
31
+ name: MTEB AmazonPolarityClassification
32
+ config: default
33
+ split: test
34
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
35
+ metrics:
36
+ - type: accuracy
37
+ value: 97.29805
38
+ - type: ap
39
+ value: 95.97973142381882
40
+ - type: f1
41
+ value: 97.29773206176378
42
+ - task:
43
+ type: Classification
44
+ dataset:
45
+ type: mteb/amazon_reviews_multi
46
+ name: MTEB AmazonReviewsClassification (en)
47
+ config: en
48
+ split: test
49
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
50
+ metrics:
51
+ - type: accuracy
52
+ value: 62.798
53
+ - type: f1
54
+ value: 61.33195375425034
55
+ - task:
56
+ type: Retrieval
57
+ dataset:
58
+ type: mteb/arguana
59
+ name: MTEB ArguAna
60
+ config: default
61
+ split: test
62
+ revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
63
+ metrics:
64
+ - type: map_at_1
65
+ value: 36.629
66
+ - type: map_at_10
67
+ value: 54.982
68
+ - type: map_at_100
69
+ value: 55.355
70
+ - type: map_at_1000
71
+ value: 55.355
72
+ - type: map_at_3
73
+ value: 50.036
74
+ - type: map_at_5
75
+ value: 53.25
76
+ - type: mrr_at_1
77
+ value: 37.624
78
+ - type: mrr_at_10
79
+ value: 55.376000000000005
80
+ - type: mrr_at_100
81
+ value: 55.749
82
+ - type: mrr_at_1000
83
+ value: 55.749
84
+ - type: mrr_at_3
85
+ value: 50.461999999999996
86
+ - type: mrr_at_5
87
+ value: 53.644999999999996
88
+ - type: ndcg_at_1
89
+ value: 36.629
90
+ - type: ndcg_at_10
91
+ value: 64.35499999999999
92
+ - type: ndcg_at_100
93
+ value: 65.778
94
+ - type: ndcg_at_1000
95
+ value: 65.778
96
+ - type: ndcg_at_3
97
+ value: 54.478
98
+ - type: ndcg_at_5
99
+ value: 60.260000000000005
100
+ - type: precision_at_1
101
+ value: 36.629
102
+ - type: precision_at_10
103
+ value: 9.381
104
+ - type: precision_at_100
105
+ value: 0.996
106
+ - type: precision_at_1000
107
+ value: 0.1
108
+ - type: precision_at_3
109
+ value: 22.451
110
+ - type: precision_at_5
111
+ value: 16.273
112
+ - type: recall_at_1
113
+ value: 36.629
114
+ - type: recall_at_10
115
+ value: 93.812
116
+ - type: recall_at_100
117
+ value: 99.644
118
+ - type: recall_at_1000
119
+ value: 99.644
120
+ - type: recall_at_3
121
+ value: 67.354
122
+ - type: recall_at_5
123
+ value: 81.366
124
+ - task:
125
+ type: Clustering
126
+ dataset:
127
+ type: mteb/arxiv-clustering-p2p
128
+ name: MTEB ArxivClusteringP2P
129
+ config: default
130
+ split: test
131
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
132
+ metrics:
133
+ - type: v_measure
134
+ value: 56.30960182540703
135
+ - task:
136
+ type: Clustering
137
+ dataset:
138
+ type: mteb/arxiv-clustering-s2s
139
+ name: MTEB ArxivClusteringS2S
140
+ config: default
141
+ split: test
142
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
143
+ metrics:
144
+ - type: v_measure
145
+ value: 51.858431775176975
146
+ - task:
147
+ type: Reranking
148
+ dataset:
149
+ type: mteb/askubuntudupquestions-reranking
150
+ name: MTEB AskUbuntuDupQuestions
151
+ config: default
152
+ split: test
153
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
154
+ metrics:
155
+ - type: map
156
+ value: 67.5678414928039
157
+ - type: mrr
158
+ value: 79.56305236776153
159
+ - task:
160
+ type: STS
161
+ dataset:
162
+ type: mteb/biosses-sts
163
+ name: MTEB BIOSSES
164
+ config: default
165
+ split: test
166
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
167
+ metrics:
168
+ - type: cos_sim_pearson
169
+ value: 82.32511136457549
170
+ - type: cos_sim_spearman
171
+ value: 79.34518142776068
172
+ - type: euclidean_pearson
173
+ value: 81.09762569927126
174
+ - type: euclidean_spearman
175
+ value: 79.33554265391781
176
+ - type: manhattan_pearson
177
+ value: 81.33942162521643
178
+ - type: manhattan_spearman
179
+ value: 79.91206181439438
180
+ - task:
181
+ type: Classification
182
+ dataset:
183
+ type: mteb/banking77
184
+ name: MTEB Banking77Classification
185
+ config: default
186
+ split: test
187
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
188
+ metrics:
189
+ - type: accuracy
190
+ value: 85.99675324675324
191
+ - type: f1
192
+ value: 85.5564660877528
193
+ - task:
194
+ type: Clustering
195
+ dataset:
196
+ type: mteb/biorxiv-clustering-p2p
197
+ name: MTEB BiorxivClusteringP2P
198
+ config: default
199
+ split: test
200
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
201
+ metrics:
202
+ - type: v_measure
203
+ value: 50.413005916654384
204
+ - task:
205
+ type: Clustering
206
+ dataset:
207
+ type: mteb/biorxiv-clustering-s2s
208
+ name: MTEB BiorxivClusteringS2S
209
+ config: default
210
+ split: test
211
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
212
+ metrics:
213
+ - type: v_measure
214
+ value: 46.58170679922341
215
+ - task:
216
+ type: Retrieval
217
+ dataset:
218
+ type: BeIR/cqadupstack
219
+ name: MTEB CQADupstackAndroidRetrieval
220
+ config: default
221
+ split: test
222
+ revision: f46a197baaae43b4f621051089b82a364682dfeb
223
+ metrics:
224
+ - type: map_at_1
225
+ value: 34.588
226
+ - type: map_at_10
227
+ value: 47.851
228
+ - type: map_at_100
229
+ value: 49.484
230
+ - type: map_at_1000
231
+ value: 49.6
232
+ - type: map_at_3
233
+ value: 43.34
234
+ - type: map_at_5
235
+ value: 45.734
236
+ - type: mrr_at_1
237
+ value: 42.203
238
+ - type: mrr_at_10
239
+ value: 53.315999999999995
240
+ - type: mrr_at_100
241
+ value: 53.977
242
+ - type: mrr_at_1000
243
+ value: 54.001
244
+ - type: mrr_at_3
245
+ value: 50.381
246
+ - type: mrr_at_5
247
+ value: 52.198
248
+ - type: ndcg_at_1
249
+ value: 42.203
250
+ - type: ndcg_at_10
251
+ value: 55.143
252
+ - type: ndcg_at_100
253
+ value: 60.278
254
+ - type: ndcg_at_1000
255
+ value: 61.497
256
+ - type: ndcg_at_3
257
+ value: 48.9
258
+ - type: ndcg_at_5
259
+ value: 51.712
260
+ - type: precision_at_1
261
+ value: 42.203
262
+ - type: precision_at_10
263
+ value: 11.016
264
+ - type: precision_at_100
265
+ value: 1.718
266
+ - type: precision_at_1000
267
+ value: 0.219
268
+ - type: precision_at_3
269
+ value: 24.224999999999998
270
+ - type: precision_at_5
271
+ value: 17.711
272
+ - type: recall_at_1
273
+ value: 34.588
274
+ - type: recall_at_10
275
+ value: 69.91000000000001
276
+ - type: recall_at_100
277
+ value: 91.01700000000001
278
+ - type: recall_at_1000
279
+ value: 98.02199999999999
280
+ - type: recall_at_3
281
+ value: 51.9
282
+ - type: recall_at_5
283
+ value: 59.604
284
+ - task:
285
+ type: Retrieval
286
+ dataset:
287
+ type: BeIR/cqadupstack
288
+ name: MTEB CQADupstackEnglishRetrieval
289
+ config: default
290
+ split: test
291
+ revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
292
+ metrics:
293
+ - type: map_at_1
294
+ value: 35.649
295
+ - type: map_at_10
296
+ value: 47.713
297
+ - type: map_at_100
298
+ value: 49.043
299
+ - type: map_at_1000
300
+ value: 49.178
301
+ - type: map_at_3
302
+ value: 44.355
303
+ - type: map_at_5
304
+ value: 46.152
305
+ - type: mrr_at_1
306
+ value: 44.268
307
+ - type: mrr_at_10
308
+ value: 53.403999999999996
309
+ - type: mrr_at_100
310
+ value: 54.035999999999994
311
+ - type: mrr_at_1000
312
+ value: 54.078
313
+ - type: mrr_at_3
314
+ value: 51.507000000000005
315
+ - type: mrr_at_5
316
+ value: 52.583999999999996
317
+ - type: ndcg_at_1
318
+ value: 44.268
319
+ - type: ndcg_at_10
320
+ value: 53.679
321
+ - type: ndcg_at_100
322
+ value: 57.794000000000004
323
+ - type: ndcg_at_1000
324
+ value: 59.74
325
+ - type: ndcg_at_3
326
+ value: 49.348
327
+ - type: ndcg_at_5
328
+ value: 51.266999999999996
329
+ - type: precision_at_1
330
+ value: 44.268
331
+ - type: precision_at_10
332
+ value: 10.120999999999999
333
+ - type: precision_at_100
334
+ value: 1.566
335
+ - type: precision_at_1000
336
+ value: 0.20600000000000002
337
+ - type: precision_at_3
338
+ value: 23.864
339
+ - type: precision_at_5
340
+ value: 16.650000000000002
341
+ - type: recall_at_1
342
+ value: 35.649
343
+ - type: recall_at_10
344
+ value: 64.152
345
+ - type: recall_at_100
346
+ value: 81.096
347
+ - type: recall_at_1000
348
+ value: 92.957
349
+ - type: recall_at_3
350
+ value: 51.498
351
+ - type: recall_at_5
352
+ value: 56.977
353
+ - task:
354
+ type: Retrieval
355
+ dataset:
356
+ type: BeIR/cqadupstack
357
+ name: MTEB CQADupstackGamingRetrieval
358
+ config: default
359
+ split: test
360
+ revision: 4885aa143210c98657558c04aaf3dc47cfb54340
361
+ metrics:
362
+ - type: map_at_1
363
+ value: 38.372
364
+ - type: map_at_10
365
+ value: 52.693
366
+ - type: map_at_100
367
+ value: 53.796
368
+ - type: map_at_1000
369
+ value: 53.836
370
+ - type: map_at_3
371
+ value: 48.818
372
+ - type: map_at_5
373
+ value: 51.052
374
+ - type: mrr_at_1
375
+ value: 44.013000000000005
376
+ - type: mrr_at_10
377
+ value: 55.769999999999996
378
+ - type: mrr_at_100
379
+ value: 56.415000000000006
380
+ - type: mrr_at_1000
381
+ value: 56.435
382
+ - type: mrr_at_3
383
+ value: 52.884
384
+ - type: mrr_at_5
385
+ value: 54.552
386
+ - type: ndcg_at_1
387
+ value: 44.013000000000005
388
+ - type: ndcg_at_10
389
+ value: 59.45
390
+ - type: ndcg_at_100
391
+ value: 63.422
392
+ - type: ndcg_at_1000
393
+ value: 64.214
394
+ - type: ndcg_at_3
395
+ value: 52.829
396
+ - type: ndcg_at_5
397
+ value: 56.079
398
+ - type: precision_at_1
399
+ value: 44.013000000000005
400
+ - type: precision_at_10
401
+ value: 9.912
402
+ - type: precision_at_100
403
+ value: 1.286
404
+ - type: precision_at_1000
405
+ value: 0.13899999999999998
406
+ - type: precision_at_3
407
+ value: 23.992
408
+ - type: precision_at_5
409
+ value: 16.803
410
+ - type: recall_at_1
411
+ value: 38.372
412
+ - type: recall_at_10
413
+ value: 76.279
414
+ - type: recall_at_100
415
+ value: 92.842
416
+ - type: recall_at_1000
417
+ value: 98.41
418
+ - type: recall_at_3
419
+ value: 58.738
420
+ - type: recall_at_5
421
+ value: 66.51899999999999
422
+ - task:
423
+ type: Retrieval
424
+ dataset:
425
+ type: BeIR/cqadupstack
426
+ name: MTEB CQADupstackGisRetrieval
427
+ config: default
428
+ split: test
429
+ revision: 5003b3064772da1887988e05400cf3806fe491f2
430
+ metrics:
431
+ - type: map_at_1
432
+ value: 26.784999999999997
433
+ - type: map_at_10
434
+ value: 37.152
435
+ - type: map_at_100
436
+ value: 38.371
437
+ - type: map_at_1000
438
+ value: 38.437
439
+ - type: map_at_3
440
+ value: 34.211999999999996
441
+ - type: map_at_5
442
+ value: 35.791000000000004
443
+ - type: mrr_at_1
444
+ value: 29.153000000000002
445
+ - type: mrr_at_10
446
+ value: 39.312999999999995
447
+ - type: mrr_at_100
448
+ value: 40.32
449
+ - type: mrr_at_1000
450
+ value: 40.367999999999995
451
+ - type: mrr_at_3
452
+ value: 36.760999999999996
453
+ - type: mrr_at_5
454
+ value: 38.083
455
+ - type: ndcg_at_1
456
+ value: 29.153000000000002
457
+ - type: ndcg_at_10
458
+ value: 42.785000000000004
459
+ - type: ndcg_at_100
460
+ value: 48.613
461
+ - type: ndcg_at_1000
462
+ value: 50.166
463
+ - type: ndcg_at_3
464
+ value: 37.255
465
+ - type: ndcg_at_5
466
+ value: 39.763999999999996
467
+ - type: precision_at_1
468
+ value: 29.153000000000002
469
+ - type: precision_at_10
470
+ value: 6.734
471
+ - type: precision_at_100
472
+ value: 1.0250000000000001
473
+ - type: precision_at_1000
474
+ value: 0.11900000000000001
475
+ - type: precision_at_3
476
+ value: 16.234
477
+ - type: precision_at_5
478
+ value: 11.232000000000001
479
+ - type: recall_at_1
480
+ value: 26.784999999999997
481
+ - type: recall_at_10
482
+ value: 57.915000000000006
483
+ - type: recall_at_100
484
+ value: 84.473
485
+ - type: recall_at_1000
486
+ value: 96.011
487
+ - type: recall_at_3
488
+ value: 43.105
489
+ - type: recall_at_5
490
+ value: 49.15
491
+ - task:
492
+ type: Retrieval
493
+ dataset:
494
+ type: BeIR/cqadupstack
495
+ name: MTEB CQADupstackMathematicaRetrieval
496
+ config: default
497
+ split: test
498
+ revision: 90fceea13679c63fe563ded68f3b6f06e50061de
499
+ metrics:
500
+ - type: map_at_1
501
+ value: 20.24
502
+ - type: map_at_10
503
+ value: 31.493
504
+ - type: map_at_100
505
+ value: 32.771
506
+ - type: map_at_1000
507
+ value: 32.883
508
+ - type: map_at_3
509
+ value: 27.062
510
+ - type: map_at_5
511
+ value: 29.421999999999997
512
+ - type: mrr_at_1
513
+ value: 25.622
514
+ - type: mrr_at_10
515
+ value: 35.729
516
+ - type: mrr_at_100
517
+ value: 36.613
518
+ - type: mrr_at_1000
519
+ value: 36.665
520
+ - type: mrr_at_3
521
+ value: 32.048
522
+ - type: mrr_at_5
523
+ value: 34.169
524
+ - type: ndcg_at_1
525
+ value: 25.622
526
+ - type: ndcg_at_10
527
+ value: 38.463
528
+ - type: ndcg_at_100
529
+ value: 43.909
530
+ - type: ndcg_at_1000
531
+ value: 46.21
532
+ - type: ndcg_at_3
533
+ value: 30.563000000000002
534
+ - type: ndcg_at_5
535
+ value: 34.178999999999995
536
+ - type: precision_at_1
537
+ value: 25.622
538
+ - type: precision_at_10
539
+ value: 7.7490000000000006
540
+ - type: precision_at_100
541
+ value: 1.1780000000000002
542
+ - type: precision_at_1000
543
+ value: 0.149
544
+ - type: precision_at_3
545
+ value: 15.049999999999999
546
+ - type: precision_at_5
547
+ value: 11.616999999999999
548
+ - type: recall_at_1
549
+ value: 20.24
550
+ - type: recall_at_10
551
+ value: 55.657000000000004
552
+ - type: recall_at_100
553
+ value: 78.803
554
+ - type: recall_at_1000
555
+ value: 94.801
556
+ - type: recall_at_3
557
+ value: 34.171
558
+ - type: recall_at_5
559
+ value: 43.16
560
+ - task:
561
+ type: Retrieval
562
+ dataset:
563
+ type: BeIR/cqadupstack
564
+ name: MTEB CQADupstackPhysicsRetrieval
565
+ config: default
566
+ split: test
567
+ revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
568
+ metrics:
569
+ - type: map_at_1
570
+ value: 32.501000000000005
571
+ - type: map_at_10
572
+ value: 46.286
573
+ - type: map_at_100
574
+ value: 47.732
575
+ - type: map_at_1000
576
+ value: 47.814
577
+ - type: map_at_3
578
+ value: 41.957
579
+ - type: map_at_5
580
+ value: 44.506
581
+ - type: mrr_at_1
582
+ value: 39.75
583
+ - type: mrr_at_10
584
+ value: 51.285000000000004
585
+ - type: mrr_at_100
586
+ value: 52.051
587
+ - type: mrr_at_1000
588
+ value: 52.075
589
+ - type: mrr_at_3
590
+ value: 48.315999999999995
591
+ - type: mrr_at_5
592
+ value: 50.125
593
+ - type: ndcg_at_1
594
+ value: 39.75
595
+ - type: ndcg_at_10
596
+ value: 53.361999999999995
597
+ - type: ndcg_at_100
598
+ value: 58.703
599
+ - type: ndcg_at_1000
600
+ value: 59.962
601
+ - type: ndcg_at_3
602
+ value: 46.786
603
+ - type: ndcg_at_5
604
+ value: 50.169
605
+ - type: precision_at_1
606
+ value: 39.75
607
+ - type: precision_at_10
608
+ value: 10.154
609
+ - type: precision_at_100
610
+ value: 1.485
611
+ - type: precision_at_1000
612
+ value: 0.17600000000000002
613
+ - type: precision_at_3
614
+ value: 23.003
615
+ - type: precision_at_5
616
+ value: 16.766000000000002
617
+ - type: recall_at_1
618
+ value: 32.501000000000005
619
+ - type: recall_at_10
620
+ value: 68.901
621
+ - type: recall_at_100
622
+ value: 90.527
623
+ - type: recall_at_1000
624
+ value: 98.307
625
+ - type: recall_at_3
626
+ value: 51.056000000000004
627
+ - type: recall_at_5
628
+ value: 59.471
629
+ - task:
630
+ type: Retrieval
631
+ dataset:
632
+ type: BeIR/cqadupstack
633
+ name: MTEB CQADupstackProgrammersRetrieval
634
+ config: default
635
+ split: test
636
+ revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
637
+ metrics:
638
+ - type: map_at_1
639
+ value: 27.962999999999997
640
+ - type: map_at_10
641
+ value: 41.434
642
+ - type: map_at_100
643
+ value: 42.961
644
+ - type: map_at_1000
645
+ value: 43.051
646
+ - type: map_at_3
647
+ value: 37.579
648
+ - type: map_at_5
649
+ value: 39.579
650
+ - type: mrr_at_1
651
+ value: 34.932
652
+ - type: mrr_at_10
653
+ value: 46.455999999999996
654
+ - type: mrr_at_100
655
+ value: 47.362
656
+ - type: mrr_at_1000
657
+ value: 47.398
658
+ - type: mrr_at_3
659
+ value: 43.855
660
+ - type: mrr_at_5
661
+ value: 45.322
662
+ - type: ndcg_at_1
663
+ value: 34.932
664
+ - type: ndcg_at_10
665
+ value: 48.323
666
+ - type: ndcg_at_100
667
+ value: 54.173
668
+ - type: ndcg_at_1000
669
+ value: 55.69
670
+ - type: ndcg_at_3
671
+ value: 42.498000000000005
672
+ - type: ndcg_at_5
673
+ value: 44.973
674
+ - type: precision_at_1
675
+ value: 34.932
676
+ - type: precision_at_10
677
+ value: 9.224
678
+ - type: precision_at_100
679
+ value: 1.429
680
+ - type: precision_at_1000
681
+ value: 0.172
682
+ - type: precision_at_3
683
+ value: 21.005
684
+ - type: precision_at_5
685
+ value: 15.0
686
+ - type: recall_at_1
687
+ value: 27.962999999999997
688
+ - type: recall_at_10
689
+ value: 63.563
690
+ - type: recall_at_100
691
+ value: 87.679
692
+ - type: recall_at_1000
693
+ value: 97.381
694
+ - type: recall_at_3
695
+ value: 47.205999999999996
696
+ - type: recall_at_5
697
+ value: 53.784
698
+ - task:
699
+ type: Retrieval
700
+ dataset:
701
+ type: BeIR/cqadupstack
702
+ name: MTEB CQADupstackRetrieval
703
+ config: default
704
+ split: test
705
+ revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
706
+ metrics:
707
+ - type: map_at_1
708
+ value: 27.9895
709
+ - type: map_at_10
710
+ value: 39.67808333333333
711
+ - type: map_at_100
712
+ value: 41.05
713
+ - type: map_at_1000
714
+ value: 41.15800000000001
715
+ - type: map_at_3
716
+ value: 36.079499999999996
717
+ - type: map_at_5
718
+ value: 38.056749999999994
719
+ - type: mrr_at_1
720
+ value: 33.405583333333325
721
+ - type: mrr_at_10
722
+ value: 43.6965
723
+ - type: mrr_at_100
724
+ value: 44.568000000000005
725
+ - type: mrr_at_1000
726
+ value: 44.61208333333334
727
+ - type: mrr_at_3
728
+ value: 40.96574999999999
729
+ - type: mrr_at_5
730
+ value: 42.529833333333336
731
+ - type: ndcg_at_1
732
+ value: 33.405583333333325
733
+ - type: ndcg_at_10
734
+ value: 46.016
735
+ - type: ndcg_at_100
736
+ value: 51.39475
737
+ - type: ndcg_at_1000
738
+ value: 53.17333333333334
739
+ - type: ndcg_at_3
740
+ value: 40.166666666666664
741
+ - type: ndcg_at_5
742
+ value: 42.899750000000004
743
+ - type: precision_at_1
744
+ value: 33.405583333333325
745
+ - type: precision_at_10
746
+ value: 8.408999999999999
747
+ - type: precision_at_100
748
+ value: 1.3129166666666665
749
+ - type: precision_at_1000
750
+ value: 0.16583333333333336
751
+ - type: precision_at_3
752
+ value: 19.05825
753
+ - type: precision_at_5
754
+ value: 13.6845
755
+ - type: recall_at_1
756
+ value: 27.9895
757
+ - type: recall_at_10
758
+ value: 60.572416666666676
759
+ - type: recall_at_100
760
+ value: 83.63975
761
+ - type: recall_at_1000
762
+ value: 95.58775
763
+ - type: recall_at_3
764
+ value: 44.402750000000005
765
+ - type: recall_at_5
766
+ value: 51.40116666666666
767
+ - task:
768
+ type: Retrieval
769
+ dataset:
770
+ type: BeIR/cqadupstack
771
+ name: MTEB CQADupstackStatsRetrieval
772
+ config: default
773
+ split: test
774
+ revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
775
+ metrics:
776
+ - type: map_at_1
777
+ value: 24.451
778
+ - type: map_at_10
779
+ value: 34.526
780
+ - type: map_at_100
781
+ value: 35.732
782
+ - type: map_at_1000
783
+ value: 35.824
784
+ - type: map_at_3
785
+ value: 31.503999999999998
786
+ - type: map_at_5
787
+ value: 33.241
788
+ - type: mrr_at_1
789
+ value: 28.221
790
+ - type: mrr_at_10
791
+ value: 37.34
792
+ - type: mrr_at_100
793
+ value: 38.389
794
+ - type: mrr_at_1000
795
+ value: 38.443
796
+ - type: mrr_at_3
797
+ value: 34.714
798
+ - type: mrr_at_5
799
+ value: 36.217
800
+ - type: ndcg_at_1
801
+ value: 28.221
802
+ - type: ndcg_at_10
803
+ value: 40.105000000000004
804
+ - type: ndcg_at_100
805
+ value: 45.619
806
+ - type: ndcg_at_1000
807
+ value: 47.597
808
+ - type: ndcg_at_3
809
+ value: 34.711
810
+ - type: ndcg_at_5
811
+ value: 37.38
812
+ - type: precision_at_1
813
+ value: 28.221
814
+ - type: precision_at_10
815
+ value: 6.7330000000000005
816
+ - type: precision_at_100
817
+ value: 1.0170000000000001
818
+ - type: precision_at_1000
819
+ value: 0.126
820
+ - type: precision_at_3
821
+ value: 15.798000000000002
822
+ - type: precision_at_5
823
+ value: 11.227
824
+ - type: recall_at_1
825
+ value: 24.451
826
+ - type: recall_at_10
827
+ value: 54.332
828
+ - type: recall_at_100
829
+ value: 78.842
830
+ - type: recall_at_1000
831
+ value: 92.868
832
+ - type: recall_at_3
833
+ value: 39.495999999999995
834
+ - type: recall_at_5
835
+ value: 46.198
836
+ - task:
837
+ type: Retrieval
838
+ dataset:
839
+ type: BeIR/cqadupstack
840
+ name: MTEB CQADupstackTexRetrieval
841
+ config: default
842
+ split: test
843
+ revision: 46989137a86843e03a6195de44b09deda022eec7
844
+ metrics:
845
+ - type: map_at_1
846
+ value: 18.989
847
+ - type: map_at_10
848
+ value: 28.189999999999998
849
+ - type: map_at_100
850
+ value: 29.575000000000003
851
+ - type: map_at_1000
852
+ value: 29.705
853
+ - type: map_at_3
854
+ value: 25.406000000000002
855
+ - type: map_at_5
856
+ value: 26.851000000000003
857
+ - type: mrr_at_1
858
+ value: 23.400000000000002
859
+ - type: mrr_at_10
860
+ value: 32.231
861
+ - type: mrr_at_100
862
+ value: 33.239000000000004
863
+ - type: mrr_at_1000
864
+ value: 33.309
865
+ - type: mrr_at_3
866
+ value: 29.869
867
+ - type: mrr_at_5
868
+ value: 31.102999999999998
869
+ - type: ndcg_at_1
870
+ value: 23.400000000000002
871
+ - type: ndcg_at_10
872
+ value: 33.634
873
+ - type: ndcg_at_100
874
+ value: 39.772999999999996
875
+ - type: ndcg_at_1000
876
+ value: 42.385
877
+ - type: ndcg_at_3
878
+ value: 28.938999999999997
879
+ - type: ndcg_at_5
880
+ value: 30.913
881
+ - type: precision_at_1
882
+ value: 23.400000000000002
883
+ - type: precision_at_10
884
+ value: 6.366
885
+ - type: precision_at_100
886
+ value: 1.1159999999999999
887
+ - type: precision_at_1000
888
+ value: 0.153
889
+ - type: precision_at_3
890
+ value: 14.212
891
+ - type: precision_at_5
892
+ value: 10.151
893
+ - type: recall_at_1
894
+ value: 18.989
895
+ - type: recall_at_10
896
+ value: 45.837
897
+ - type: recall_at_100
898
+ value: 73.04899999999999
899
+ - type: recall_at_1000
900
+ value: 91.245
901
+ - type: recall_at_3
902
+ value: 32.309
903
+ - type: recall_at_5
904
+ value: 37.665
905
+ - task:
906
+ type: Retrieval
907
+ dataset:
908
+ type: BeIR/cqadupstack
909
+ name: MTEB CQADupstackUnixRetrieval
910
+ config: default
911
+ split: test
912
+ revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
913
+ metrics:
914
+ - type: map_at_1
915
+ value: 30.595
916
+ - type: map_at_10
917
+ value: 42.286
918
+ - type: map_at_100
919
+ value: 43.586999999999996
920
+ - type: map_at_1000
921
+ value: 43.669000000000004
922
+ - type: map_at_3
923
+ value: 38.888
924
+ - type: map_at_5
925
+ value: 40.669
926
+ - type: mrr_at_1
927
+ value: 36.287000000000006
928
+ - type: mrr_at_10
929
+ value: 46.405
930
+ - type: mrr_at_100
931
+ value: 47.282999999999994
932
+ - type: mrr_at_1000
933
+ value: 47.327000000000005
934
+ - type: mrr_at_3
935
+ value: 43.874
936
+ - type: mrr_at_5
937
+ value: 45.414
938
+ - type: ndcg_at_1
939
+ value: 36.287000000000006
940
+ - type: ndcg_at_10
941
+ value: 48.407
942
+ - type: ndcg_at_100
943
+ value: 53.824000000000005
944
+ - type: ndcg_at_1000
945
+ value: 55.483000000000004
946
+ - type: ndcg_at_3
947
+ value: 42.9
948
+ - type: ndcg_at_5
949
+ value: 45.391999999999996
950
+ - type: precision_at_1
951
+ value: 36.287000000000006
952
+ - type: precision_at_10
953
+ value: 8.414000000000001
954
+ - type: precision_at_100
955
+ value: 1.232
956
+ - type: precision_at_1000
957
+ value: 0.147
958
+ - type: precision_at_3
959
+ value: 20.118
960
+ - type: precision_at_5
961
+ value: 13.993
962
+ - type: recall_at_1
963
+ value: 30.595
964
+ - type: recall_at_10
965
+ value: 62.656
966
+ - type: recall_at_100
967
+ value: 85.74199999999999
968
+ - type: recall_at_1000
969
+ value: 96.854
970
+ - type: recall_at_3
971
+ value: 47.413
972
+ - type: recall_at_5
973
+ value: 54.04
974
+ - task:
975
+ type: Retrieval
976
+ dataset:
977
+ type: BeIR/cqadupstack
978
+ name: MTEB CQADupstackWebmastersRetrieval
979
+ config: default
980
+ split: test
981
+ revision: 160c094312a0e1facb97e55eeddb698c0abe3571
982
+ metrics:
983
+ - type: map_at_1
984
+ value: 28.236
985
+ - type: map_at_10
986
+ value: 39.751
987
+ - type: map_at_100
988
+ value: 41.435
989
+ - type: map_at_1000
990
+ value: 41.677
991
+ - type: map_at_3
992
+ value: 35.957
993
+ - type: map_at_5
994
+ value: 38.112
995
+ - type: mrr_at_1
996
+ value: 33.794000000000004
997
+ - type: mrr_at_10
998
+ value: 44.449
999
+ - type: mrr_at_100
1000
+ value: 45.268
1001
+ - type: mrr_at_1000
1002
+ value: 45.311
1003
+ - type: mrr_at_3
1004
+ value: 41.502
1005
+ - type: mrr_at_5
1006
+ value: 43.142
1007
+ - type: ndcg_at_1
1008
+ value: 33.794000000000004
1009
+ - type: ndcg_at_10
1010
+ value: 46.787
1011
+ - type: ndcg_at_100
1012
+ value: 52.290000000000006
1013
+ - type: ndcg_at_1000
1014
+ value: 54.336
1015
+ - type: ndcg_at_3
1016
+ value: 40.78
1017
+ - type: ndcg_at_5
1018
+ value: 43.669999999999995
1019
+ - type: precision_at_1
1020
+ value: 33.794000000000004
1021
+ - type: precision_at_10
1022
+ value: 9.051
1023
+ - type: precision_at_100
1024
+ value: 1.7919999999999998
1025
+ - type: precision_at_1000
1026
+ value: 0.259
1027
+ - type: precision_at_3
1028
+ value: 19.368
1029
+ - type: precision_at_5
1030
+ value: 14.229
1031
+ - type: recall_at_1
1032
+ value: 28.236
1033
+ - type: recall_at_10
1034
+ value: 61.358000000000004
1035
+ - type: recall_at_100
1036
+ value: 85.028
1037
+ - type: recall_at_1000
1038
+ value: 97.813
1039
+ - type: recall_at_3
1040
+ value: 44.207
1041
+ - type: recall_at_5
1042
+ value: 51.885000000000005
1043
+ - task:
1044
+ type: Retrieval
1045
+ dataset:
1046
+ type: BeIR/cqadupstack
1047
+ name: MTEB CQADupstackWordpressRetrieval
1048
+ config: default
1049
+ split: test
1050
+ revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
1051
+ metrics:
1052
+ - type: map_at_1
1053
+ value: 17.505000000000003
1054
+ - type: map_at_10
1055
+ value: 26.762000000000004
1056
+ - type: map_at_100
1057
+ value: 28.113
1058
+ - type: map_at_1000
1059
+ value: 28.222
1060
+ - type: map_at_3
1061
+ value: 23.876
1062
+ - type: map_at_5
1063
+ value: 25.572
1064
+ - type: mrr_at_1
1065
+ value: 19.224
1066
+ - type: mrr_at_10
1067
+ value: 28.660000000000004
1068
+ - type: mrr_at_100
1069
+ value: 29.863
1070
+ - type: mrr_at_1000
1071
+ value: 29.935000000000002
1072
+ - type: mrr_at_3
1073
+ value: 25.878
1074
+ - type: mrr_at_5
1075
+ value: 27.449
1076
+ - type: ndcg_at_1
1077
+ value: 19.224
1078
+ - type: ndcg_at_10
1079
+ value: 32.054
1080
+ - type: ndcg_at_100
1081
+ value: 38.339
1082
+ - type: ndcg_at_1000
1083
+ value: 40.8
1084
+ - type: ndcg_at_3
1085
+ value: 26.491
1086
+ - type: ndcg_at_5
1087
+ value: 29.298999999999996
1088
+ - type: precision_at_1
1089
+ value: 19.224
1090
+ - type: precision_at_10
1091
+ value: 5.434
1092
+ - type: precision_at_100
1093
+ value: 0.911
1094
+ - type: precision_at_1000
1095
+ value: 0.125
1096
+ - type: precision_at_3
1097
+ value: 11.83
1098
+ - type: precision_at_5
1099
+ value: 8.834999999999999
1100
+ - type: recall_at_1
1101
+ value: 17.505000000000003
1102
+ - type: recall_at_10
1103
+ value: 46.309
1104
+ - type: recall_at_100
1105
+ value: 74.579
1106
+ - type: recall_at_1000
1107
+ value: 92.384
1108
+ - type: recall_at_3
1109
+ value: 31.734
1110
+ - type: recall_at_5
1111
+ value: 38.361000000000004
1112
+ - task:
1113
+ type: Classification
1114
+ dataset:
1115
+ type: mteb/emotion
1116
+ name: MTEB EmotionClassification
1117
+ config: default
1118
+ split: test
1119
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
1120
+ metrics:
1121
+ - type: accuracy
1122
+ value: 67.085
1123
+ - type: f1
1124
+ value: 61.019909873305686
1125
+ - task:
1126
+ type: Retrieval
1127
+ dataset:
1128
+ type: mteb/fiqa
1129
+ name: MTEB FiQA2018
1130
+ config: default
1131
+ split: test
1132
+ revision: 27a168819829fe9bcd655c2df245fb19452e8e06
1133
+ metrics:
1134
+ - type: map_at_1
1135
+ value: 32.251999999999995
1136
+ - type: map_at_10
1137
+ value: 53.98500000000001
1138
+ - type: map_at_100
1139
+ value: 56.093
1140
+ - type: map_at_1000
1141
+ value: 56.198
1142
+ - type: map_at_3
1143
+ value: 46.765
1144
+ - type: map_at_5
1145
+ value: 50.739999999999995
1146
+ - type: mrr_at_1
1147
+ value: 60.956999999999994
1148
+ - type: mrr_at_10
1149
+ value: 69.38600000000001
1150
+ - type: mrr_at_100
1151
+ value: 69.877
1152
+ - type: mrr_at_1000
1153
+ value: 69.884
1154
+ - type: mrr_at_3
1155
+ value: 67.052
1156
+ - type: mrr_at_5
1157
+ value: 68.356
1158
+ - type: ndcg_at_1
1159
+ value: 60.956999999999994
1160
+ - type: ndcg_at_10
1161
+ value: 62.78399999999999
1162
+ - type: ndcg_at_100
1163
+ value: 68.743
1164
+ - type: ndcg_at_1000
1165
+ value: 69.92399999999999
1166
+ - type: ndcg_at_3
1167
+ value: 57.336
1168
+ - type: ndcg_at_5
1169
+ value: 59.121
1170
+ - type: precision_at_1
1171
+ value: 60.956999999999994
1172
+ - type: precision_at_10
1173
+ value: 17.346
1174
+ - type: precision_at_100
1175
+ value: 2.3689999999999998
1176
+ - type: precision_at_1000
1177
+ value: 0.259
1178
+ - type: precision_at_3
1179
+ value: 37.912
1180
+ - type: precision_at_5
1181
+ value: 27.900999999999996
1182
+ - type: recall_at_1
1183
+ value: 32.251999999999995
1184
+ - type: recall_at_10
1185
+ value: 71.616
1186
+ - type: recall_at_100
1187
+ value: 92.685
1188
+ - type: recall_at_1000
1189
+ value: 98.983
1190
+ - type: recall_at_3
1191
+ value: 52.064
1192
+ - type: recall_at_5
1193
+ value: 60.49099999999999
1194
+ - task:
1195
+ type: Classification
1196
+ dataset:
1197
+ type: mteb/imdb
1198
+ name: MTEB ImdbClassification
1199
+ config: default
1200
+ split: test
1201
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1202
+ metrics:
1203
+ - type: accuracy
1204
+ value: 96.4592
1205
+ - type: ap
1206
+ value: 94.57299077219179
1207
+ - type: f1
1208
+ value: 96.45842059801627
1209
+ - task:
1210
+ type: Classification
1211
+ dataset:
1212
+ type: mteb/mtop_domain
1213
+ name: MTEB MTOPDomainClassification (en)
1214
+ config: en
1215
+ split: test
1216
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1217
+ metrics:
1218
+ - type: accuracy
1219
+ value: 98.45873233014134
1220
+ - type: f1
1221
+ value: 98.38426074551533
1222
+ - task:
1223
+ type: Classification
1224
+ dataset:
1225
+ type: mteb/mtop_intent
1226
+ name: MTEB MTOPIntentClassification (en)
1227
+ config: en
1228
+ split: test
1229
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1230
+ metrics:
1231
+ - type: accuracy
1232
+ value: 90.01823985408116
1233
+ - type: f1
1234
+ value: 70.71419843084274
1235
+ - task:
1236
+ type: Classification
1237
+ dataset:
1238
+ type: mteb/amazon_massive_intent
1239
+ name: MTEB MassiveIntentClassification (en)
1240
+ config: en
1241
+ split: test
1242
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1243
+ metrics:
1244
+ - type: accuracy
1245
+ value: 84.35104236718225
1246
+ - type: f1
1247
+ value: 82.50884520186432
1248
+ - task:
1249
+ type: Classification
1250
+ dataset:
1251
+ type: mteb/amazon_massive_scenario
1252
+ name: MTEB MassiveScenarioClassification (en)
1253
+ config: en
1254
+ split: test
1255
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1256
+ metrics:
1257
+ - type: accuracy
1258
+ value: 88.0665770006725
1259
+ - type: f1
1260
+ value: 87.06928510969733
1261
+ - task:
1262
+ type: Clustering
1263
+ dataset:
1264
+ type: mteb/medrxiv-clustering-p2p
1265
+ name: MTEB MedrxivClusteringP2P
1266
+ config: default
1267
+ split: test
1268
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1269
+ metrics:
1270
+ - type: v_measure
1271
+ value: 46.053400985420204
1272
+ - task:
1273
+ type: Clustering
1274
+ dataset:
1275
+ type: mteb/medrxiv-clustering-s2s
1276
+ name: MTEB MedrxivClusteringS2S
1277
+ config: default
1278
+ split: test
1279
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1280
+ metrics:
1281
+ - type: v_measure
1282
+ value: 44.445957227318054
1283
+ - task:
1284
+ type: Reranking
1285
+ dataset:
1286
+ type: mteb/mind_small
1287
+ name: MTEB MindSmallReranking
1288
+ config: default
1289
+ split: test
1290
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1291
+ metrics:
1292
+ - type: map
1293
+ value: 33.277065197675775
1294
+ - type: mrr
1295
+ value: 34.654704063060656
1296
+ - task:
1297
+ type: Retrieval
1298
+ dataset:
1299
+ type: mteb/nfcorpus
1300
+ name: MTEB NFCorpus
1301
+ config: default
1302
+ split: test
1303
+ revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
1304
+ metrics:
1305
+ - type: map_at_1
1306
+ value: 6.728000000000001
1307
+ - type: map_at_10
1308
+ value: 15.052999999999999
1309
+ - type: map_at_100
1310
+ value: 19.901
1311
+ - type: map_at_1000
1312
+ value: 21.72
1313
+ - type: map_at_3
1314
+ value: 10.901
1315
+ - type: map_at_5
1316
+ value: 12.651000000000002
1317
+ - type: mrr_at_1
1318
+ value: 52.322
1319
+ - type: mrr_at_10
1320
+ value: 60.614999999999995
1321
+ - type: mrr_at_100
1322
+ value: 61.199000000000005
1323
+ - type: mrr_at_1000
1324
+ value: 61.227
1325
+ - type: mrr_at_3
1326
+ value: 58.977999999999994
1327
+ - type: mrr_at_5
1328
+ value: 59.907
1329
+ - type: ndcg_at_1
1330
+ value: 50.619
1331
+ - type: ndcg_at_10
1332
+ value: 40.278000000000006
1333
+ - type: ndcg_at_100
1334
+ value: 37.585
1335
+ - type: ndcg_at_1000
1336
+ value: 46.459
1337
+ - type: ndcg_at_3
1338
+ value: 46.143
1339
+ - type: ndcg_at_5
1340
+ value: 43.7
1341
+ - type: precision_at_1
1342
+ value: 52.012
1343
+ - type: precision_at_10
1344
+ value: 30.154999999999998
1345
+ - type: precision_at_100
1346
+ value: 9.87
1347
+ - type: precision_at_1000
1348
+ value: 2.343
1349
+ - type: precision_at_3
1350
+ value: 42.931000000000004
1351
+ - type: precision_at_5
1352
+ value: 37.771
1353
+ - type: recall_at_1
1354
+ value: 6.728000000000001
1355
+ - type: recall_at_10
1356
+ value: 19.372
1357
+ - type: recall_at_100
1358
+ value: 39.044000000000004
1359
+ - type: recall_at_1000
1360
+ value: 71.602
1361
+ - type: recall_at_3
1362
+ value: 12.328
1363
+ - type: recall_at_5
1364
+ value: 14.758
1365
+ - task:
1366
+ type: Retrieval
1367
+ dataset:
1368
+ type: mteb/quora
1369
+ name: MTEB QuoraRetrieval
1370
+ config: default
1371
+ split: test
1372
+ revision: None
1373
+ metrics:
1374
+ - type: map_at_1
1375
+ value: 72.421
1376
+ - type: map_at_10
1377
+ value: 86.648
1378
+ - type: map_at_100
1379
+ value: 87.258
1380
+ - type: map_at_1000
1381
+ value: 87.26899999999999
1382
+ - type: map_at_3
1383
+ value: 83.82
1384
+ - type: map_at_5
1385
+ value: 85.629
1386
+ - type: mrr_at_1
1387
+ value: 83.21
1388
+ - type: mrr_at_10
1389
+ value: 89.198
1390
+ - type: mrr_at_100
1391
+ value: 89.277
1392
+ - type: mrr_at_1000
1393
+ value: 89.277
1394
+ - type: mrr_at_3
1395
+ value: 88.428
1396
+ - type: mrr_at_5
1397
+ value: 88.98
1398
+ - type: ndcg_at_1
1399
+ value: 83.24000000000001
1400
+ - type: ndcg_at_10
1401
+ value: 90.067
1402
+ - type: ndcg_at_100
1403
+ value: 91.091
1404
+ - type: ndcg_at_1000
1405
+ value: 91.146
1406
+ - type: ndcg_at_3
1407
+ value: 87.6
1408
+ - type: ndcg_at_5
1409
+ value: 89.004
1410
+ - type: precision_at_1
1411
+ value: 83.24000000000001
1412
+ - type: precision_at_10
1413
+ value: 13.644
1414
+ - type: precision_at_100
1415
+ value: 1.542
1416
+ - type: precision_at_1000
1417
+ value: 0.157
1418
+ - type: precision_at_3
1419
+ value: 38.437
1420
+ - type: precision_at_5
1421
+ value: 25.194
1422
+ - type: recall_at_1
1423
+ value: 72.421
1424
+ - type: recall_at_10
1425
+ value: 96.49600000000001
1426
+ - type: recall_at_100
1427
+ value: 99.802
1428
+ - type: recall_at_1000
1429
+ value: 100.0
1430
+ - type: recall_at_3
1431
+ value: 89.31400000000001
1432
+ - type: recall_at_5
1433
+ value: 93.363
1434
+ - task:
1435
+ type: Clustering
1436
+ dataset:
1437
+ type: mteb/reddit-clustering
1438
+ name: MTEB RedditClustering
1439
+ config: default
1440
+ split: test
1441
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
1442
+ metrics:
1443
+ - type: v_measure
1444
+ value: 73.97491289906442
1445
+ - task:
1446
+ type: Clustering
1447
+ dataset:
1448
+ type: mteb/reddit-clustering-p2p
1449
+ name: MTEB RedditClusteringP2P
1450
+ config: default
1451
+ split: test
1452
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
1453
+ metrics:
1454
+ - type: v_measure
1455
+ value: 73.49590001712183
1456
+ - task:
1457
+ type: Retrieval
1458
+ dataset:
1459
+ type: mteb/scidocs
1460
+ name: MTEB SCIDOCS
1461
+ config: default
1462
+ split: test
1463
+ revision: None
1464
+ metrics:
1465
+ - type: map_at_1
1466
+ value: 6.978
1467
+ - type: map_at_10
1468
+ value: 18.307000000000002
1469
+ - type: map_at_100
1470
+ value: 21.605
1471
+ - type: map_at_1000
1472
+ value: 21.965
1473
+ - type: map_at_3
1474
+ value: 12.642000000000001
1475
+ - type: map_at_5
1476
+ value: 15.453
1477
+ - type: mrr_at_1
1478
+ value: 34.300000000000004
1479
+ - type: mrr_at_10
1480
+ value: 46.886
1481
+ - type: mrr_at_100
1482
+ value: 47.78
1483
+ - type: mrr_at_1000
1484
+ value: 47.795
1485
+ - type: mrr_at_3
1486
+ value: 42.467
1487
+ - type: mrr_at_5
1488
+ value: 45.427
1489
+ - type: ndcg_at_1
1490
+ value: 34.300000000000004
1491
+ - type: ndcg_at_10
1492
+ value: 29.372999999999998
1493
+ - type: ndcg_at_100
1494
+ value: 40.355000000000004
1495
+ - type: ndcg_at_1000
1496
+ value: 45.221000000000004
1497
+ - type: ndcg_at_3
1498
+ value: 27.230999999999998
1499
+ - type: ndcg_at_5
1500
+ value: 24.352
1501
+ - type: precision_at_1
1502
+ value: 34.300000000000004
1503
+ - type: precision_at_10
1504
+ value: 15.36
1505
+ - type: precision_at_100
1506
+ value: 3.116
1507
+ - type: precision_at_1000
1508
+ value: 0.426
1509
+ - type: precision_at_3
1510
+ value: 25.367
1511
+ - type: precision_at_5
1512
+ value: 21.62
1513
+ - type: recall_at_1
1514
+ value: 6.978
1515
+ - type: recall_at_10
1516
+ value: 31.142999999999997
1517
+ - type: recall_at_100
1518
+ value: 63.27199999999999
1519
+ - type: recall_at_1000
1520
+ value: 86.512
1521
+ - type: recall_at_3
1522
+ value: 15.433
1523
+ - type: recall_at_5
1524
+ value: 21.918000000000003
1525
+ - task:
1526
+ type: STS
1527
+ dataset:
1528
+ type: mteb/sickr-sts
1529
+ name: MTEB SICK-R
1530
+ config: default
1531
+ split: test
1532
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1533
+ metrics:
1534
+ - type: cos_sim_pearson
1535
+ value: 81.90996932803432
1536
+ - type: cos_sim_spearman
1537
+ value: 78.73848819688604
1538
+ - type: euclidean_pearson
1539
+ value: 78.82008134820491
1540
+ - type: euclidean_spearman
1541
+ value: 78.73797968799013
1542
+ - type: manhattan_pearson
1543
+ value: 78.98817729907871
1544
+ - type: manhattan_spearman
1545
+ value: 78.88989195290672
1546
+ - task:
1547
+ type: STS
1548
+ dataset:
1549
+ type: mteb/sts12-sts
1550
+ name: MTEB STS12
1551
+ config: default
1552
+ split: test
1553
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1554
+ metrics:
1555
+ - type: cos_sim_pearson
1556
+ value: 86.9169693104017
1557
+ - type: cos_sim_spearman
1558
+ value: 78.6067489618467
1559
+ - type: euclidean_pearson
1560
+ value: 83.04545335395649
1561
+ - type: euclidean_spearman
1562
+ value: 78.6070135484733
1563
+ - type: manhattan_pearson
1564
+ value: 83.49435095447187
1565
+ - type: manhattan_spearman
1566
+ value: 78.9690144080464
1567
+ - task:
1568
+ type: STS
1569
+ dataset:
1570
+ type: mteb/sts13-sts
1571
+ name: MTEB STS13
1572
+ config: default
1573
+ split: test
1574
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1575
+ metrics:
1576
+ - type: cos_sim_pearson
1577
+ value: 88.86389266236574
1578
+ - type: cos_sim_spearman
1579
+ value: 88.88070867328447
1580
+ - type: euclidean_pearson
1581
+ value: 88.52907860408021
1582
+ - type: euclidean_spearman
1583
+ value: 88.88041097815055
1584
+ - type: manhattan_pearson
1585
+ value: 88.65795865729802
1586
+ - type: manhattan_spearman
1587
+ value: 89.09614539167227
1588
+ - task:
1589
+ type: STS
1590
+ dataset:
1591
+ type: mteb/sts14-sts
1592
+ name: MTEB STS14
1593
+ config: default
1594
+ split: test
1595
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
1596
+ metrics:
1597
+ - type: cos_sim_pearson
1598
+ value: 85.90258145848692
1599
+ - type: cos_sim_spearman
1600
+ value: 84.16679932371741
1601
+ - type: euclidean_pearson
1602
+ value: 84.95294032883719
1603
+ - type: euclidean_spearman
1604
+ value: 84.16781112349103
1605
+ - type: manhattan_pearson
1606
+ value: 85.18004344325733
1607
+ - type: manhattan_spearman
1608
+ value: 84.52374692147366
1609
+ - task:
1610
+ type: STS
1611
+ dataset:
1612
+ type: mteb/sts15-sts
1613
+ name: MTEB STS15
1614
+ config: default
1615
+ split: test
1616
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
1617
+ metrics:
1618
+ - type: cos_sim_pearson
1619
+ value: 87.89191454971963
1620
+ - type: cos_sim_spearman
1621
+ value: 88.44916193520294
1622
+ - type: euclidean_pearson
1623
+ value: 87.85883738567667
1624
+ - type: euclidean_spearman
1625
+ value: 88.44928880968476
1626
+ - type: manhattan_pearson
1627
+ value: 88.1871454451139
1628
+ - type: manhattan_spearman
1629
+ value: 88.94431200065807
1630
+ - task:
1631
+ type: STS
1632
+ dataset:
1633
+ type: mteb/sts16-sts
1634
+ name: MTEB STS16
1635
+ config: default
1636
+ split: test
1637
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
1638
+ metrics:
1639
+ - type: cos_sim_pearson
1640
+ value: 85.36716703986853
1641
+ - type: cos_sim_spearman
1642
+ value: 86.16132844716138
1643
+ - type: euclidean_pearson
1644
+ value: 85.25811478217042
1645
+ - type: euclidean_spearman
1646
+ value: 86.16215262183867
1647
+ - type: manhattan_pearson
1648
+ value: 85.43281209842574
1649
+ - type: manhattan_spearman
1650
+ value: 86.44640605346511
1651
+ - task:
1652
+ type: STS
1653
+ dataset:
1654
+ type: mteb/sts17-crosslingual-sts
1655
+ name: MTEB STS17 (en-en)
1656
+ config: en-en
1657
+ split: test
1658
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
1659
+ metrics:
1660
+ - type: cos_sim_pearson
1661
+ value: 88.29794966742152
1662
+ - type: cos_sim_spearman
1663
+ value: 88.27359278171622
1664
+ - type: euclidean_pearson
1665
+ value: 88.06469525438956
1666
+ - type: euclidean_spearman
1667
+ value: 88.28670070410784
1668
+ - type: manhattan_pearson
1669
+ value: 87.89087342332212
1670
+ - type: manhattan_spearman
1671
+ value: 88.11041644578535
1672
+ - task:
1673
+ type: STS
1674
+ dataset:
1675
+ type: mteb/sts22-crosslingual-sts
1676
+ name: MTEB STS22 (en)
1677
+ config: en
1678
+ split: test
1679
+ revision: eea2b4fe26a775864c896887d910b76a8098ad3f
1680
+ metrics:
1681
+ - type: cos_sim_pearson
1682
+ value: 66.75199645389645
1683
+ - type: cos_sim_spearman
1684
+ value: 66.20137384486978
1685
+ - type: euclidean_pearson
1686
+ value: 68.622513186352
1687
+ - type: euclidean_spearman
1688
+ value: 66.23640152769464
1689
+ - type: manhattan_pearson
1690
+ value: 68.97988448341921
1691
+ - type: manhattan_spearman
1692
+ value: 66.39142269154794
1693
+ - task:
1694
+ type: STS
1695
+ dataset:
1696
+ type: mteb/stsbenchmark-sts
1697
+ name: MTEB STSBenchmark
1698
+ config: default
1699
+ split: test
1700
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
1701
+ metrics:
1702
+ - type: cos_sim_pearson
1703
+ value: 86.48693548775047
1704
+ - type: cos_sim_spearman
1705
+ value: 86.08823308674964
1706
+ - type: euclidean_pearson
1707
+ value: 85.65692420470154
1708
+ - type: euclidean_spearman
1709
+ value: 86.08859480677167
1710
+ - type: manhattan_pearson
1711
+ value: 85.90164709250936
1712
+ - type: manhattan_spearman
1713
+ value: 86.40785365360473
1714
+ - task:
1715
+ type: Reranking
1716
+ dataset:
1717
+ type: mteb/scidocs-reranking
1718
+ name: MTEB SciDocsRR
1719
+ config: default
1720
+ split: test
1721
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
1722
+ metrics:
1723
+ - type: map
1724
+ value: 88.80093449044475
1725
+ - type: mrr
1726
+ value: 97.02094655526028
1727
+ - task:
1728
+ type: Retrieval
1729
+ dataset:
1730
+ type: mteb/scifact
1731
+ name: MTEB SciFact
1732
+ config: default
1733
+ split: test
1734
+ revision: 0228b52cf27578f30900b9e5271d331663a030d7
1735
+ metrics:
1736
+ - type: map_at_1
1737
+ value: 59.594
1738
+ - type: map_at_10
1739
+ value: 72.649
1740
+ - type: map_at_100
1741
+ value: 73.051
1742
+ - type: map_at_1000
1743
+ value: 73.056
1744
+ - type: map_at_3
1745
+ value: 69.667
1746
+ - type: map_at_5
1747
+ value: 71.528
1748
+ - type: mrr_at_1
1749
+ value: 62.666999999999994
1750
+ - type: mrr_at_10
1751
+ value: 73.625
1752
+ - type: mrr_at_100
1753
+ value: 73.956
1754
+ - type: mrr_at_1000
1755
+ value: 73.962
1756
+ - type: mrr_at_3
1757
+ value: 71.77799999999999
1758
+ - type: mrr_at_5
1759
+ value: 72.994
1760
+ - type: ndcg_at_1
1761
+ value: 62.666999999999994
1762
+ - type: ndcg_at_10
1763
+ value: 77.981
1764
+ - type: ndcg_at_100
1765
+ value: 79.474
1766
+ - type: ndcg_at_1000
1767
+ value: 79.569
1768
+ - type: ndcg_at_3
1769
+ value: 73.4
1770
+ - type: ndcg_at_5
1771
+ value: 75.806
1772
+ - type: precision_at_1
1773
+ value: 62.666999999999994
1774
+ - type: precision_at_10
1775
+ value: 10.567
1776
+ - type: precision_at_100
1777
+ value: 1.123
1778
+ - type: precision_at_1000
1779
+ value: 0.11299999999999999
1780
+ - type: precision_at_3
1781
+ value: 29.555999999999997
1782
+ - type: precision_at_5
1783
+ value: 19.467000000000002
1784
+ - type: recall_at_1
1785
+ value: 59.594
1786
+ - type: recall_at_10
1787
+ value: 93.167
1788
+ - type: recall_at_100
1789
+ value: 99.333
1790
+ - type: recall_at_1000
1791
+ value: 100.0
1792
+ - type: recall_at_3
1793
+ value: 80.72200000000001
1794
+ - type: recall_at_5
1795
+ value: 86.79400000000001
1796
+ - task:
1797
+ type: PairClassification
1798
+ dataset:
1799
+ type: mteb/sprintduplicatequestions-pairclassification
1800
+ name: MTEB SprintDuplicateQuestions
1801
+ config: default
1802
+ split: test
1803
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
1804
+ metrics:
1805
+ - type: cos_sim_accuracy
1806
+ value: 99.67920792079208
1807
+ - type: cos_sim_ap
1808
+ value: 91.12451155203843
1809
+ - type: cos_sim_f1
1810
+ value: 82.7763496143959
1811
+ - type: cos_sim_precision
1812
+ value: 85.18518518518519
1813
+ - type: cos_sim_recall
1814
+ value: 80.5
1815
+ - type: dot_accuracy
1816
+ value: 99.68019801980198
1817
+ - type: dot_ap
1818
+ value: 91.12360077338997
1819
+ - type: dot_f1
1820
+ value: 82.81893004115227
1821
+ - type: dot_precision
1822
+ value: 85.27542372881356
1823
+ - type: dot_recall
1824
+ value: 80.5
1825
+ - type: euclidean_accuracy
1826
+ value: 99.67920792079208
1827
+ - type: euclidean_ap
1828
+ value: 91.12526537243333
1829
+ - type: euclidean_f1
1830
+ value: 82.7763496143959
1831
+ - type: euclidean_precision
1832
+ value: 85.18518518518519
1833
+ - type: euclidean_recall
1834
+ value: 80.5
1835
+ - type: manhattan_accuracy
1836
+ value: 99.68613861386139
1837
+ - type: manhattan_ap
1838
+ value: 91.52045550487428
1839
+ - type: manhattan_f1
1840
+ value: 83.38461538461539
1841
+ - type: manhattan_precision
1842
+ value: 85.57894736842105
1843
+ - type: manhattan_recall
1844
+ value: 81.3
1845
+ - type: max_accuracy
1846
+ value: 99.68613861386139
1847
+ - type: max_ap
1848
+ value: 91.52045550487428
1849
+ - type: max_f1
1850
+ value: 83.38461538461539
1851
+ - task:
1852
+ type: Clustering
1853
+ dataset:
1854
+ type: mteb/stackexchange-clustering
1855
+ name: MTEB StackExchangeClustering
1856
+ config: default
1857
+ split: test
1858
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
1859
+ metrics:
1860
+ - type: v_measure
1861
+ value: 79.90649023801956
1862
+ - task:
1863
+ type: Clustering
1864
+ dataset:
1865
+ type: mteb/stackexchange-clustering-p2p
1866
+ name: MTEB StackExchangeClusteringP2P
1867
+ config: default
1868
+ split: test
1869
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
1870
+ metrics:
1871
+ - type: v_measure
1872
+ value: 49.681864218959205
1873
+ - task:
1874
+ type: Reranking
1875
+ dataset:
1876
+ type: mteb/stackoverflowdupquestions-reranking
1877
+ name: MTEB StackOverflowDupQuestions
1878
+ config: default
1879
+ split: test
1880
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
1881
+ metrics:
1882
+ - type: map
1883
+ value: 55.89272881949486
1884
+ - type: mrr
1885
+ value: 56.88128660555132
1886
+ - task:
1887
+ type: Summarization
1888
+ dataset:
1889
+ type: mteb/summeval
1890
+ name: MTEB SummEval
1891
+ config: default
1892
+ split: test
1893
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
1894
+ metrics:
1895
+ - type: cos_sim_pearson
1896
+ value: 31.945233723225954
1897
+ - type: cos_sim_spearman
1898
+ value: 31.361651389713284
1899
+ - type: dot_pearson
1900
+ value: 31.96193321438737
1901
+ - type: dot_spearman
1902
+ value: 31.37045148053791
1903
+ - task:
1904
+ type: Retrieval
1905
+ dataset:
1906
+ type: mteb/trec-covid
1907
+ name: MTEB TRECCOVID
1908
+ config: default
1909
+ split: test
1910
+ revision: None
1911
+ metrics:
1912
+ - type: map_at_1
1913
+ value: 0.244
1914
+ - type: map_at_10
1915
+ value: 2.011
1916
+ - type: map_at_100
1917
+ value: 12.555
1918
+ - type: map_at_1000
1919
+ value: 30.386000000000003
1920
+ - type: map_at_3
1921
+ value: 0.718
1922
+ - type: map_at_5
1923
+ value: 1.118
1924
+ - type: mrr_at_1
1925
+ value: 94.0
1926
+ - type: mrr_at_10
1927
+ value: 97.0
1928
+ - type: mrr_at_100
1929
+ value: 97.0
1930
+ - type: mrr_at_1000
1931
+ value: 97.0
1932
+ - type: mrr_at_3
1933
+ value: 97.0
1934
+ - type: mrr_at_5
1935
+ value: 97.0
1936
+ - type: ndcg_at_1
1937
+ value: 93.0
1938
+ - type: ndcg_at_10
1939
+ value: 81.612
1940
+ - type: ndcg_at_100
1941
+ value: 63.468
1942
+ - type: ndcg_at_1000
1943
+ value: 56.508
1944
+ - type: ndcg_at_3
1945
+ value: 88.81599999999999
1946
+ - type: ndcg_at_5
1947
+ value: 85.599
1948
+ - type: precision_at_1
1949
+ value: 94.0
1950
+ - type: precision_at_10
1951
+ value: 84.0
1952
+ - type: precision_at_100
1953
+ value: 65.18
1954
+ - type: precision_at_1000
1955
+ value: 24.758
1956
+ - type: precision_at_3
1957
+ value: 93.333
1958
+ - type: precision_at_5
1959
+ value: 89.2
1960
+ - type: recall_at_1
1961
+ value: 0.244
1962
+ - type: recall_at_10
1963
+ value: 2.161
1964
+ - type: recall_at_100
1965
+ value: 15.862000000000002
1966
+ - type: recall_at_1000
1967
+ value: 53.146
1968
+ - type: recall_at_3
1969
+ value: 0.738
1970
+ - type: recall_at_5
1971
+ value: 1.167
1972
+ - task:
1973
+ type: Classification
1974
+ dataset:
1975
+ type: mteb/toxic_conversations_50k
1976
+ name: MTEB ToxicConversationsClassification
1977
+ config: default
1978
+ split: test
1979
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
1980
+ metrics:
1981
+ - type: accuracy
1982
+ value: 82.948
1983
+ - type: ap
1984
+ value: 26.37282466987438
1985
+ - type: f1
1986
+ value: 66.9868680256644
1987
+ - task:
1988
+ type: Classification
1989
+ dataset:
1990
+ type: mteb/tweet_sentiment_extraction
1991
+ name: MTEB TweetSentimentExtractionClassification
1992
+ config: default
1993
+ split: test
1994
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
1995
+ metrics:
1996
+ - type: accuracy
1997
+ value: 73.78607809847199
1998
+ - type: f1
1999
+ value: 74.1324659804999
2000
+ - task:
2001
+ type: Clustering
2002
+ dataset:
2003
+ type: mteb/twentynewsgroups-clustering
2004
+ name: MTEB TwentyNewsgroupsClustering
2005
+ config: default
2006
+ split: test
2007
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
2008
+ metrics:
2009
+ - type: v_measure
2010
+ value: 54.11838832136805
2011
+ - task:
2012
+ type: PairClassification
2013
+ dataset:
2014
+ type: mteb/twittersemeval2015-pairclassification
2015
+ name: MTEB TwitterSemEval2015
2016
+ config: default
2017
+ split: test
2018
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
2019
+ metrics:
2020
+ - type: cos_sim_accuracy
2021
+ value: 87.64975859808071
2022
+ - type: cos_sim_ap
2023
+ value: 79.0918389936708
2024
+ - type: cos_sim_f1
2025
+ value: 72.18518052585232
2026
+ - type: cos_sim_precision
2027
+ value: 68.98292858860303
2028
+ - type: cos_sim_recall
2029
+ value: 75.69920844327177
2030
+ - type: dot_accuracy
2031
+ value: 87.64379805686356
2032
+ - type: dot_ap
2033
+ value: 79.09373814934631
2034
+ - type: dot_f1
2035
+ value: 72.18216318785579
2036
+ - type: dot_precision
2037
+ value: 69.33171324422844
2038
+ - type: dot_recall
2039
+ value: 75.27704485488127
2040
+ - type: euclidean_accuracy
2041
+ value: 87.64975859808071
2042
+ - type: euclidean_ap
2043
+ value: 79.09199976607417
2044
+ - type: euclidean_f1
2045
+ value: 72.17610062893083
2046
+ - type: euclidean_precision
2047
+ value: 68.96634615384616
2048
+ - type: euclidean_recall
2049
+ value: 75.69920844327177
2050
+ - type: manhattan_accuracy
2051
+ value: 87.61399535077786
2052
+ - type: manhattan_ap
2053
+ value: 78.91167634954901
2054
+ - type: manhattan_f1
2055
+ value: 72.0995176440721
2056
+ - type: manhattan_precision
2057
+ value: 69.47162426614481
2058
+ - type: manhattan_recall
2059
+ value: 74.93403693931398
2060
+ - type: max_accuracy
2061
+ value: 87.64975859808071
2062
+ - type: max_ap
2063
+ value: 79.09373814934631
2064
+ - type: max_f1
2065
+ value: 72.18518052585232
2066
+ - task:
2067
+ type: PairClassification
2068
+ dataset:
2069
+ type: mteb/twitterurlcorpus-pairclassification
2070
+ name: MTEB TwitterURLCorpus
2071
+ config: default
2072
+ split: test
2073
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
2074
+ metrics:
2075
+ - type: cos_sim_accuracy
2076
+ value: 89.43415997205729
2077
+ - type: cos_sim_ap
2078
+ value: 86.69200523144308
2079
+ - type: cos_sim_f1
2080
+ value: 79.16424418604652
2081
+ - type: cos_sim_precision
2082
+ value: 74.95871180842279
2083
+ - type: cos_sim_recall
2084
+ value: 83.86972590083154
2085
+ - type: dot_accuracy
2086
+ value: 89.43415997205729
2087
+ - type: dot_ap
2088
+ value: 86.69346224233253
2089
+ - type: dot_f1
2090
+ value: 79.15884340968833
2091
+ - type: dot_precision
2092
+ value: 77.26139862190294
2093
+ - type: dot_recall
2094
+ value: 81.15183246073299
2095
+ - type: euclidean_accuracy
2096
+ value: 89.43221950556915
2097
+ - type: euclidean_ap
2098
+ value: 86.69176407206174
2099
+ - type: euclidean_f1
2100
+ value: 79.16409231328366
2101
+ - type: euclidean_precision
2102
+ value: 74.97074413161698
2103
+ - type: euclidean_recall
2104
+ value: 83.85432707114259
2105
+ - type: manhattan_accuracy
2106
+ value: 89.49237396670159
2107
+ - type: manhattan_ap
2108
+ value: 86.72274876446832
2109
+ - type: manhattan_f1
2110
+ value: 79.18286510672633
2111
+ - type: manhattan_precision
2112
+ value: 75.6058271466592
2113
+ - type: manhattan_recall
2114
+ value: 83.1151832460733
2115
+ - type: max_accuracy
2116
+ value: 89.49237396670159
2117
+ - type: max_ap
2118
+ value: 86.72274876446832
2119
+ - type: max_f1
2120
+ value: 79.18286510672633
2121
+ - task:
2122
+ type: STS
2123
+ dataset:
2124
+ type: C-MTEB/AFQMC
2125
+ name: MTEB AFQMC
2126
+ config: default
2127
+ split: validation
2128
+ revision: b44c3b011063adb25877c13823db83bb193913c4
2129
+ metrics:
2130
+ - type: cos_sim_pearson
2131
+ value: 65.7103214280117
2132
+ - type: cos_sim_spearman
2133
+ value: 72.62249544256886
2134
+ - type: euclidean_pearson
2135
+ value: 71.36812167041296
2136
+ - type: euclidean_spearman
2137
+ value: 72.62325941111307
2138
+ - type: manhattan_pearson
2139
+ value: 71.25613851615468
2140
+ - type: manhattan_spearman
2141
+ value: 72.54244015155267
2142
+ - task:
2143
+ type: STS
2144
+ dataset:
2145
+ type: C-MTEB/ATEC
2146
+ name: MTEB ATEC
2147
+ config: default
2148
+ split: test
2149
+ revision: 0f319b1142f28d00e055a6770f3f726ae9b7d865
2150
+ metrics:
2151
+ - type: cos_sim_pearson
2152
+ value: 59.903467713912974
2153
+ - type: cos_sim_spearman
2154
+ value: 62.8205444560593
2155
+ - type: euclidean_pearson
2156
+ value: 67.06329904158285
2157
+ - type: euclidean_spearman
2158
+ value: 62.82051743557576
2159
+ - type: manhattan_pearson
2160
+ value: 66.97943759454319
2161
+ - type: manhattan_spearman
2162
+ value: 62.763028353169325
2163
+ - task:
2164
+ type: Classification
2165
+ dataset:
2166
+ type: mteb/amazon_reviews_multi
2167
+ name: MTEB AmazonReviewsClassification (zh)
2168
+ config: zh
2169
+ split: test
2170
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
2171
+ metrics:
2172
+ - type: accuracy
2173
+ value: 53.57399999999999
2174
+ - type: f1
2175
+ value: 50.57496370390049
2176
+ - task:
2177
+ type: STS
2178
+ dataset:
2179
+ type: C-MTEB/BQ
2180
+ name: MTEB BQ
2181
+ config: default
2182
+ split: test
2183
+ revision: e3dda5e115e487b39ec7e618c0c6a29137052a55
2184
+ metrics:
2185
+ - type: cos_sim_pearson
2186
+ value: 79.09488668095824
2187
+ - type: cos_sim_spearman
2188
+ value: 81.34731850197655
2189
+ - type: euclidean_pearson
2190
+ value: 82.19030116395511
2191
+ - type: euclidean_spearman
2192
+ value: 81.34699287691117
2193
+ - type: manhattan_pearson
2194
+ value: 82.19510202220734
2195
+ - type: manhattan_spearman
2196
+ value: 81.35888167395795
2197
+ - task:
2198
+ type: Clustering
2199
+ dataset:
2200
+ type: C-MTEB/CLSClusteringP2P
2201
+ name: MTEB CLSClusteringP2P
2202
+ config: default
2203
+ split: test
2204
+ revision: 4b6227591c6c1a73bc76b1055f3b7f3588e72476
2205
+ metrics:
2206
+ - type: v_measure
2207
+ value: 48.60079470735067
2208
+ - task:
2209
+ type: Clustering
2210
+ dataset:
2211
+ type: C-MTEB/CLSClusteringS2S
2212
+ name: MTEB CLSClusteringS2S
2213
+ config: default
2214
+ split: test
2215
+ revision: e458b3f5414b62b7f9f83499ac1f5497ae2e869f
2216
+ metrics:
2217
+ - type: v_measure
2218
+ value: 46.125672623152155
2219
+ - task:
2220
+ type: Reranking
2221
+ dataset:
2222
+ type: C-MTEB/CMedQAv1-reranking
2223
+ name: MTEB CMedQAv1
2224
+ config: default
2225
+ split: test
2226
+ revision: 8d7f1e942507dac42dc58017c1a001c3717da7df
2227
+ metrics:
2228
+ - type: map
2229
+ value: 88.0714642862605
2230
+ - type: mrr
2231
+ value: 90.17428571428572
2232
+ - task:
2233
+ type: Reranking
2234
+ dataset:
2235
+ type: C-MTEB/CMedQAv2-reranking
2236
+ name: MTEB CMedQAv2
2237
+ config: default
2238
+ split: test
2239
+ revision: 23d186750531a14a0357ca22cd92d712fd512ea0
2240
+ metrics:
2241
+ - type: map
2242
+ value: 88.51263170426526
2243
+ - type: mrr
2244
+ value: 90.53325396825396
2245
+ - task:
2246
+ type: Retrieval
2247
+ dataset:
2248
+ type: C-MTEB/CmedqaRetrieval
2249
+ name: MTEB CmedqaRetrieval
2250
+ config: default
2251
+ split: dev
2252
+ revision: cd540c506dae1cf9e9a59c3e06f42030d54e7301
2253
+ metrics:
2254
+ - type: map_at_1
2255
+ value: 29.610999999999997
2256
+ - type: map_at_10
2257
+ value: 42.832
2258
+ - type: map_at_100
2259
+ value: 44.639
2260
+ - type: map_at_1000
2261
+ value: 44.738
2262
+ - type: map_at_3
2263
+ value: 38.549
2264
+ - type: map_at_5
2265
+ value: 40.905
2266
+ - type: mrr_at_1
2267
+ value: 44.461
2268
+ - type: mrr_at_10
2269
+ value: 52.274
2270
+ - type: mrr_at_100
2271
+ value: 53.179
2272
+ - type: mrr_at_1000
2273
+ value: 53.213
2274
+ - type: mrr_at_3
2275
+ value: 49.917
2276
+ - type: mrr_at_5
2277
+ value: 51.13799999999999
2278
+ - type: ndcg_at_1
2279
+ value: 44.461
2280
+ - type: ndcg_at_10
2281
+ value: 49.557
2282
+ - type: ndcg_at_100
2283
+ value: 56.432
2284
+ - type: ndcg_at_1000
2285
+ value: 58.050000000000004
2286
+ - type: ndcg_at_3
2287
+ value: 44.419
2288
+ - type: ndcg_at_5
2289
+ value: 46.386
2290
+ - type: precision_at_1
2291
+ value: 44.461
2292
+ - type: precision_at_10
2293
+ value: 10.673
2294
+ - type: precision_at_100
2295
+ value: 1.6310000000000002
2296
+ - type: precision_at_1000
2297
+ value: 0.184
2298
+ - type: precision_at_3
2299
+ value: 24.656
2300
+ - type: precision_at_5
2301
+ value: 17.619
2302
+ - type: recall_at_1
2303
+ value: 29.610999999999997
2304
+ - type: recall_at_10
2305
+ value: 60.112
2306
+ - type: recall_at_100
2307
+ value: 88.346
2308
+ - type: recall_at_1000
2309
+ value: 98.993
2310
+ - type: recall_at_3
2311
+ value: 44.243
2312
+ - type: recall_at_5
2313
+ value: 50.64300000000001
2314
+ - task:
2315
+ type: PairClassification
2316
+ dataset:
2317
+ type: C-MTEB/CMNLI
2318
+ name: MTEB Cmnli
2319
+ config: default
2320
+ split: validation
2321
+ revision: 41bc36f332156f7adc9e38f53777c959b2ae9766
2322
+ metrics:
2323
+ - type: cos_sim_accuracy
2324
+ value: 82.17678893565845
2325
+ - type: cos_sim_ap
2326
+ value: 89.77899888165327
2327
+ - type: cos_sim_f1
2328
+ value: 83.03306727480046
2329
+ - type: cos_sim_precision
2330
+ value: 81.0371689294458
2331
+ - type: cos_sim_recall
2332
+ value: 85.1297638531681
2333
+ - type: dot_accuracy
2334
+ value: 82.1647624774504
2335
+ - type: dot_ap
2336
+ value: 89.78074283382892
2337
+ - type: dot_f1
2338
+ value: 83.03306727480046
2339
+ - type: dot_precision
2340
+ value: 81.0371689294458
2341
+ - type: dot_recall
2342
+ value: 85.1297638531681
2343
+ - type: euclidean_accuracy
2344
+ value: 82.1888153938665
2345
+ - type: euclidean_ap
2346
+ value: 89.77917362529757
2347
+ - type: euclidean_f1
2348
+ value: 83.03306727480046
2349
+ - type: euclidean_precision
2350
+ value: 81.0371689294458
2351
+ - type: euclidean_recall
2352
+ value: 85.1297638531681
2353
+ - type: manhattan_accuracy
2354
+ value: 81.82802164762477
2355
+ - type: manhattan_ap
2356
+ value: 89.56708721584408
2357
+ - type: manhattan_f1
2358
+ value: 82.72179938657275
2359
+ - type: manhattan_precision
2360
+ value: 80.44631020768891
2361
+ - type: manhattan_recall
2362
+ value: 85.1297638531681
2363
+ - type: max_accuracy
2364
+ value: 82.1888153938665
2365
+ - type: max_ap
2366
+ value: 89.78074283382892
2367
+ - type: max_f1
2368
+ value: 83.03306727480046
2369
+ - task:
2370
+ type: Retrieval
2371
+ dataset:
2372
+ type: C-MTEB/CovidRetrieval
2373
+ name: MTEB CovidRetrieval
2374
+ config: default
2375
+ split: dev
2376
+ revision: 1271c7809071a13532e05f25fb53511ffce77117
2377
+ metrics:
2378
+ - type: map_at_1
2379
+ value: 66.807
2380
+ - type: map_at_10
2381
+ value: 75.47399999999999
2382
+ - type: map_at_100
2383
+ value: 75.837
2384
+ - type: map_at_1000
2385
+ value: 75.84
2386
+ - type: map_at_3
2387
+ value: 73.67399999999999
2388
+ - type: map_at_5
2389
+ value: 74.558
2390
+ - type: mrr_at_1
2391
+ value: 66.913
2392
+ - type: mrr_at_10
2393
+ value: 75.467
2394
+ - type: mrr_at_100
2395
+ value: 75.823
2396
+ - type: mrr_at_1000
2397
+ value: 75.82600000000001
2398
+ - type: mrr_at_3
2399
+ value: 73.67399999999999
2400
+ - type: mrr_at_5
2401
+ value: 74.586
2402
+ - type: ndcg_at_1
2403
+ value: 66.913
2404
+ - type: ndcg_at_10
2405
+ value: 79.591
2406
+ - type: ndcg_at_100
2407
+ value: 81.15
2408
+ - type: ndcg_at_1000
2409
+ value: 81.229
2410
+ - type: ndcg_at_3
2411
+ value: 75.83800000000001
2412
+ - type: ndcg_at_5
2413
+ value: 77.45
2414
+ - type: precision_at_1
2415
+ value: 66.913
2416
+ - type: precision_at_10
2417
+ value: 9.325999999999999
2418
+ - type: precision_at_100
2419
+ value: 1.0030000000000001
2420
+ - type: precision_at_1000
2421
+ value: 0.101
2422
+ - type: precision_at_3
2423
+ value: 27.432000000000002
2424
+ - type: precision_at_5
2425
+ value: 17.281
2426
+ - type: recall_at_1
2427
+ value: 66.807
2428
+ - type: recall_at_10
2429
+ value: 92.46600000000001
2430
+ - type: recall_at_100
2431
+ value: 99.262
2432
+ - type: recall_at_1000
2433
+ value: 99.895
2434
+ - type: recall_at_3
2435
+ value: 82.086
2436
+ - type: recall_at_5
2437
+ value: 85.985
2438
+ - task:
2439
+ type: Retrieval
2440
+ dataset:
2441
+ type: C-MTEB/DuRetrieval
2442
+ name: MTEB DuRetrieval
2443
+ config: default
2444
+ split: dev
2445
+ revision: a1a333e290fe30b10f3f56498e3a0d911a693ced
2446
+ metrics:
2447
+ - type: map_at_1
2448
+ value: 26.599
2449
+ - type: map_at_10
2450
+ value: 81.577
2451
+ - type: map_at_100
2452
+ value: 84.368
2453
+ - type: map_at_1000
2454
+ value: 84.39999999999999
2455
+ - type: map_at_3
2456
+ value: 56.825
2457
+ - type: map_at_5
2458
+ value: 71.462
2459
+ - type: mrr_at_1
2460
+ value: 90.5
2461
+ - type: mrr_at_10
2462
+ value: 93.798
2463
+ - type: mrr_at_100
2464
+ value: 93.851
2465
+ - type: mrr_at_1000
2466
+ value: 93.853
2467
+ - type: mrr_at_3
2468
+ value: 93.5
2469
+ - type: mrr_at_5
2470
+ value: 93.672
2471
+ - type: ndcg_at_1
2472
+ value: 90.5
2473
+ - type: ndcg_at_10
2474
+ value: 88.633
2475
+ - type: ndcg_at_100
2476
+ value: 91.217
2477
+ - type: ndcg_at_1000
2478
+ value: 91.484
2479
+ - type: ndcg_at_3
2480
+ value: 87.29599999999999
2481
+ - type: ndcg_at_5
2482
+ value: 86.31299999999999
2483
+ - type: precision_at_1
2484
+ value: 90.5
2485
+ - type: precision_at_10
2486
+ value: 42.18
2487
+ - type: precision_at_100
2488
+ value: 4.839
2489
+ - type: precision_at_1000
2490
+ value: 0.49
2491
+ - type: precision_at_3
2492
+ value: 78.133
2493
+ - type: precision_at_5
2494
+ value: 65.82000000000001
2495
+ - type: recall_at_1
2496
+ value: 26.599
2497
+ - type: recall_at_10
2498
+ value: 90.137
2499
+ - type: recall_at_100
2500
+ value: 98.393
2501
+ - type: recall_at_1000
2502
+ value: 99.747
2503
+ - type: recall_at_3
2504
+ value: 59.199999999999996
2505
+ - type: recall_at_5
2506
+ value: 76.173
2507
+ - task:
2508
+ type: Retrieval
2509
+ dataset:
2510
+ type: C-MTEB/EcomRetrieval
2511
+ name: MTEB EcomRetrieval
2512
+ config: default
2513
+ split: dev
2514
+ revision: 687de13dc7294d6fd9be10c6945f9e8fec8166b9
2515
+ metrics:
2516
+ - type: map_at_1
2517
+ value: 55.2
2518
+ - type: map_at_10
2519
+ value: 64.925
2520
+ - type: map_at_100
2521
+ value: 65.446
2522
+ - type: map_at_1000
2523
+ value: 65.459
2524
+ - type: map_at_3
2525
+ value: 62.266999999999996
2526
+ - type: map_at_5
2527
+ value: 64.107
2528
+ - type: mrr_at_1
2529
+ value: 55.2
2530
+ - type: mrr_at_10
2531
+ value: 64.925
2532
+ - type: mrr_at_100
2533
+ value: 65.446
2534
+ - type: mrr_at_1000
2535
+ value: 65.459
2536
+ - type: mrr_at_3
2537
+ value: 62.266999999999996
2538
+ - type: mrr_at_5
2539
+ value: 64.107
2540
+ - type: ndcg_at_1
2541
+ value: 55.2
2542
+ - type: ndcg_at_10
2543
+ value: 69.85900000000001
2544
+ - type: ndcg_at_100
2545
+ value: 72.194
2546
+ - type: ndcg_at_1000
2547
+ value: 72.506
2548
+ - type: ndcg_at_3
2549
+ value: 64.538
2550
+ - type: ndcg_at_5
2551
+ value: 67.843
2552
+ - type: precision_at_1
2553
+ value: 55.2
2554
+ - type: precision_at_10
2555
+ value: 8.540000000000001
2556
+ - type: precision_at_100
2557
+ value: 0.959
2558
+ - type: precision_at_1000
2559
+ value: 0.098
2560
+ - type: precision_at_3
2561
+ value: 23.7
2562
+ - type: precision_at_5
2563
+ value: 15.82
2564
+ - type: recall_at_1
2565
+ value: 55.2
2566
+ - type: recall_at_10
2567
+ value: 85.39999999999999
2568
+ - type: recall_at_100
2569
+ value: 95.89999999999999
2570
+ - type: recall_at_1000
2571
+ value: 98.3
2572
+ - type: recall_at_3
2573
+ value: 71.1
2574
+ - type: recall_at_5
2575
+ value: 79.10000000000001
2576
+ - task:
2577
+ type: Classification
2578
+ dataset:
2579
+ type: C-MTEB/IFlyTek-classification
2580
+ name: MTEB IFlyTek
2581
+ config: default
2582
+ split: validation
2583
+ revision: 421605374b29664c5fc098418fe20ada9bd55f8a
2584
+ metrics:
2585
+ - type: accuracy
2586
+ value: 53.92843401308196
2587
+ - type: f1
2588
+ value: 40.44614048360205
2589
+ - task:
2590
+ type: Classification
2591
+ dataset:
2592
+ type: C-MTEB/JDReview-classification
2593
+ name: MTEB JDReview
2594
+ config: default
2595
+ split: test
2596
+ revision: b7c64bd89eb87f8ded463478346f76731f07bf8b
2597
+ metrics:
2598
+ - type: accuracy
2599
+ value: 86.22889305816133
2600
+ - type: ap
2601
+ value: 55.542660925360835
2602
+ - type: f1
2603
+ value: 81.26964576055315
2604
+ - task:
2605
+ type: STS
2606
+ dataset:
2607
+ type: C-MTEB/LCQMC
2608
+ name: MTEB LCQMC
2609
+ config: default
2610
+ split: test
2611
+ revision: 17f9b096f80380fce5ed12a9be8be7784b337daf
2612
+ metrics:
2613
+ - type: cos_sim_pearson
2614
+ value: 68.50234587951512
2615
+ - type: cos_sim_spearman
2616
+ value: 73.04229322574785
2617
+ - type: euclidean_pearson
2618
+ value: 71.76475440799503
2619
+ - type: euclidean_spearman
2620
+ value: 73.04203161533454
2621
+ - type: manhattan_pearson
2622
+ value: 71.75530397681868
2623
+ - type: manhattan_spearman
2624
+ value: 73.01054099221574
2625
+ - task:
2626
+ type: Reranking
2627
+ dataset:
2628
+ type: C-MTEB/Mmarco-reranking
2629
+ name: MTEB MMarcoReranking
2630
+ config: default
2631
+ split: dev
2632
+ revision: None
2633
+ metrics:
2634
+ - type: map
2635
+ value: 22.67056873798454
2636
+ - type: mrr
2637
+ value: 21.63888888888889
2638
+ - task:
2639
+ type: Retrieval
2640
+ dataset:
2641
+ type: C-MTEB/MMarcoRetrieval
2642
+ name: MTEB MMarcoRetrieval
2643
+ config: default
2644
+ split: dev
2645
+ revision: 539bbde593d947e2a124ba72651aafc09eb33fc2
2646
+ metrics:
2647
+ - type: map_at_1
2648
+ value: 67.65
2649
+ - type: map_at_10
2650
+ value: 76.726
2651
+ - type: map_at_100
2652
+ value: 77.03
2653
+ - type: map_at_1000
2654
+ value: 77.042
2655
+ - type: map_at_3
2656
+ value: 74.924
2657
+ - type: map_at_5
2658
+ value: 76.08200000000001
2659
+ - type: mrr_at_1
2660
+ value: 69.87100000000001
2661
+ - type: mrr_at_10
2662
+ value: 77.238
2663
+ - type: mrr_at_100
2664
+ value: 77.492
2665
+ - type: mrr_at_1000
2666
+ value: 77.503
2667
+ - type: mrr_at_3
2668
+ value: 75.633
2669
+ - type: mrr_at_5
2670
+ value: 76.678
2671
+ - type: ndcg_at_1
2672
+ value: 69.87100000000001
2673
+ - type: ndcg_at_10
2674
+ value: 80.37100000000001
2675
+ - type: ndcg_at_100
2676
+ value: 81.658
2677
+ - type: ndcg_at_1000
2678
+ value: 81.94200000000001
2679
+ - type: ndcg_at_3
2680
+ value: 76.94
2681
+ - type: ndcg_at_5
2682
+ value: 78.926
2683
+ - type: precision_at_1
2684
+ value: 69.87100000000001
2685
+ - type: precision_at_10
2686
+ value: 9.681
2687
+ - type: precision_at_100
2688
+ value: 1.032
2689
+ - type: precision_at_1000
2690
+ value: 0.105
2691
+ - type: precision_at_3
2692
+ value: 28.906
2693
+ - type: precision_at_5
2694
+ value: 18.404
2695
+ - type: recall_at_1
2696
+ value: 67.65
2697
+ - type: recall_at_10
2698
+ value: 91.078
2699
+ - type: recall_at_100
2700
+ value: 96.767
2701
+ - type: recall_at_1000
2702
+ value: 98.933
2703
+ - type: recall_at_3
2704
+ value: 82.02000000000001
2705
+ - type: recall_at_5
2706
+ value: 86.771
2707
+ - task:
2708
+ type: Classification
2709
+ dataset:
2710
+ type: mteb/amazon_massive_intent
2711
+ name: MTEB MassiveIntentClassification (zh-CN)
2712
+ config: zh-CN
2713
+ split: test
2714
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
2715
+ metrics:
2716
+ - type: accuracy
2717
+ value: 79.7848016139879
2718
+ - type: f1
2719
+ value: 76.99189501152489
2720
+ - task:
2721
+ type: Classification
2722
+ dataset:
2723
+ type: mteb/amazon_massive_scenario
2724
+ name: MTEB MassiveScenarioClassification (zh-CN)
2725
+ config: zh-CN
2726
+ split: test
2727
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
2728
+ metrics:
2729
+ - type: accuracy
2730
+ value: 83.64492266308001
2731
+ - type: f1
2732
+ value: 82.84955852311293
2733
+ - task:
2734
+ type: Retrieval
2735
+ dataset:
2736
+ type: C-MTEB/MedicalRetrieval
2737
+ name: MTEB MedicalRetrieval
2738
+ config: default
2739
+ split: dev
2740
+ revision: 2039188fb5800a9803ba5048df7b76e6fb151fc6
2741
+ metrics:
2742
+ - type: map_at_1
2743
+ value: 54.400000000000006
2744
+ - type: map_at_10
2745
+ value: 60.529999999999994
2746
+ - type: map_at_100
2747
+ value: 61.114999999999995
2748
+ - type: map_at_1000
2749
+ value: 61.153999999999996
2750
+ - type: map_at_3
2751
+ value: 59.150000000000006
2752
+ - type: map_at_5
2753
+ value: 59.955000000000005
2754
+ - type: mrr_at_1
2755
+ value: 54.50000000000001
2756
+ - type: mrr_at_10
2757
+ value: 60.58
2758
+ - type: mrr_at_100
2759
+ value: 61.165000000000006
2760
+ - type: mrr_at_1000
2761
+ value: 61.204
2762
+ - type: mrr_at_3
2763
+ value: 59.199999999999996
2764
+ - type: mrr_at_5
2765
+ value: 60.004999999999995
2766
+ - type: ndcg_at_1
2767
+ value: 54.400000000000006
2768
+ - type: ndcg_at_10
2769
+ value: 63.522999999999996
2770
+ - type: ndcg_at_100
2771
+ value: 66.742
2772
+ - type: ndcg_at_1000
2773
+ value: 67.818
2774
+ - type: ndcg_at_3
2775
+ value: 60.702999999999996
2776
+ - type: ndcg_at_5
2777
+ value: 62.149
2778
+ - type: precision_at_1
2779
+ value: 54.400000000000006
2780
+ - type: precision_at_10
2781
+ value: 7.290000000000001
2782
+ - type: precision_at_100
2783
+ value: 0.8880000000000001
2784
+ - type: precision_at_1000
2785
+ value: 0.097
2786
+ - type: precision_at_3
2787
+ value: 21.733
2788
+ - type: precision_at_5
2789
+ value: 13.74
2790
+ - type: recall_at_1
2791
+ value: 54.400000000000006
2792
+ - type: recall_at_10
2793
+ value: 72.89999999999999
2794
+ - type: recall_at_100
2795
+ value: 88.8
2796
+ - type: recall_at_1000
2797
+ value: 97.39999999999999
2798
+ - type: recall_at_3
2799
+ value: 65.2
2800
+ - type: recall_at_5
2801
+ value: 68.7
2802
+ - task:
2803
+ type: Classification
2804
+ dataset:
2805
+ type: C-MTEB/MultilingualSentiment-classification
2806
+ name: MTEB MultilingualSentiment
2807
+ config: default
2808
+ split: validation
2809
+ revision: 46958b007a63fdbf239b7672c25d0bea67b5ea1a
2810
+ metrics:
2811
+ - type: accuracy
2812
+ value: 77.16000000000001
2813
+ - type: f1
2814
+ value: 76.97953105264186
2815
+ - task:
2816
+ type: PairClassification
2817
+ dataset:
2818
+ type: C-MTEB/OCNLI
2819
+ name: MTEB Ocnli
2820
+ config: default
2821
+ split: validation
2822
+ revision: 66e76a618a34d6d565d5538088562851e6daa7ec
2823
+ metrics:
2824
+ - type: cos_sim_accuracy
2825
+ value: 79.48023822414727
2826
+ - type: cos_sim_ap
2827
+ value: 84.483894203704
2828
+ - type: cos_sim_f1
2829
+ value: 80.55130168453292
2830
+ - type: cos_sim_precision
2831
+ value: 77.96442687747036
2832
+ - type: cos_sim_recall
2833
+ value: 83.31573389651531
2834
+ - type: dot_accuracy
2835
+ value: 79.48023822414727
2836
+ - type: dot_ap
2837
+ value: 84.49261973641154
2838
+ - type: dot_f1
2839
+ value: 80.55130168453292
2840
+ - type: dot_precision
2841
+ value: 77.96442687747036
2842
+ - type: dot_recall
2843
+ value: 83.31573389651531
2844
+ - type: euclidean_accuracy
2845
+ value: 79.48023822414727
2846
+ - type: euclidean_ap
2847
+ value: 84.48068994534293
2848
+ - type: euclidean_f1
2849
+ value: 80.55130168453292
2850
+ - type: euclidean_precision
2851
+ value: 77.96442687747036
2852
+ - type: euclidean_recall
2853
+ value: 83.31573389651531
2854
+ - type: manhattan_accuracy
2855
+ value: 79.37195452084461
2856
+ - type: manhattan_ap
2857
+ value: 84.45931914984077
2858
+ - type: manhattan_f1
2859
+ value: 80.53142565150742
2860
+ - type: manhattan_precision
2861
+ value: 78.01980198019803
2862
+ - type: manhattan_recall
2863
+ value: 83.21013727560718
2864
+ - type: max_accuracy
2865
+ value: 79.48023822414727
2866
+ - type: max_ap
2867
+ value: 84.49261973641154
2868
+ - type: max_f1
2869
+ value: 80.55130168453292
2870
+ - task:
2871
+ type: Classification
2872
+ dataset:
2873
+ type: C-MTEB/OnlineShopping-classification
2874
+ name: MTEB OnlineShopping
2875
+ config: default
2876
+ split: test
2877
+ revision: e610f2ebd179a8fda30ae534c3878750a96db120
2878
+ metrics:
2879
+ - type: accuracy
2880
+ value: 94.3
2881
+ - type: ap
2882
+ value: 92.84324255663363
2883
+ - type: f1
2884
+ value: 94.29275233313747
2885
+ - task:
2886
+ type: STS
2887
+ dataset:
2888
+ type: C-MTEB/PAWSX
2889
+ name: MTEB PAWSX
2890
+ config: default
2891
+ split: test
2892
+ revision: 9c6a90e430ac22b5779fb019a23e820b11a8b5e1
2893
+ metrics:
2894
+ - type: cos_sim_pearson
2895
+ value: 50.25954594958544
2896
+ - type: cos_sim_spearman
2897
+ value: 55.1554675848278
2898
+ - type: euclidean_pearson
2899
+ value: 53.71113201288935
2900
+ - type: euclidean_spearman
2901
+ value: 55.1558156481826
2902
+ - type: manhattan_pearson
2903
+ value: 53.816355416293646
2904
+ - type: manhattan_spearman
2905
+ value: 55.14310001157623
2906
+ - task:
2907
+ type: STS
2908
+ dataset:
2909
+ type: C-MTEB/QBQTC
2910
+ name: MTEB QBQTC
2911
+ config: default
2912
+ split: test
2913
+ revision: 790b0510dc52b1553e8c49f3d2afb48c0e5c48b7
2914
+ metrics:
2915
+ - type: cos_sim_pearson
2916
+ value: 29.187751074660845
2917
+ - type: cos_sim_spearman
2918
+ value: 30.889291180505868
2919
+ - type: euclidean_pearson
2920
+ value: 28.73210543314964
2921
+ - type: euclidean_spearman
2922
+ value: 30.889662787316784
2923
+ - type: manhattan_pearson
2924
+ value: 29.21703764852649
2925
+ - type: manhattan_spearman
2926
+ value: 31.47317743982721
2927
+ - task:
2928
+ type: STS
2929
+ dataset:
2930
+ type: mteb/sts22-crosslingual-sts
2931
+ name: MTEB STS22 (zh)
2932
+ config: zh
2933
+ split: test
2934
+ revision: eea2b4fe26a775864c896887d910b76a8098ad3f
2935
+ metrics:
2936
+ - type: cos_sim_pearson
2937
+ value: 61.1898272680276
2938
+ - type: cos_sim_spearman
2939
+ value: 64.93927648503598
2940
+ - type: euclidean_pearson
2941
+ value: 61.11026474293018
2942
+ - type: euclidean_spearman
2943
+ value: 64.94229072933243
2944
+ - type: manhattan_pearson
2945
+ value: 62.19814132782434
2946
+ - type: manhattan_spearman
2947
+ value: 65.2583560877569
2948
+ - task:
2949
+ type: STS
2950
+ dataset:
2951
+ type: C-MTEB/STSB
2952
+ name: MTEB STSB
2953
+ config: default
2954
+ split: test
2955
+ revision: 0cde68302b3541bb8b3c340dc0644b0b745b3dc0
2956
+ metrics:
2957
+ - type: cos_sim_pearson
2958
+ value: 78.76122365246462
2959
+ - type: cos_sim_spearman
2960
+ value: 78.68211802616669
2961
+ - type: euclidean_pearson
2962
+ value: 77.17265704615994
2963
+ - type: euclidean_spearman
2964
+ value: 78.68087191872655
2965
+ - type: manhattan_pearson
2966
+ value: 77.61313585452194
2967
+ - type: manhattan_spearman
2968
+ value: 79.22153641729726
2969
+ - task:
2970
+ type: Reranking
2971
+ dataset:
2972
+ type: C-MTEB/T2Reranking
2973
+ name: MTEB T2Reranking
2974
+ config: default
2975
+ split: dev
2976
+ revision: 76631901a18387f85eaa53e5450019b87ad58ef9
2977
+ metrics:
2978
+ - type: map
2979
+ value: 67.80243119237458
2980
+ - type: mrr
2981
+ value: 78.00406497512118
2982
+ - task:
2983
+ type: Retrieval
2984
+ dataset:
2985
+ type: C-MTEB/T2Retrieval
2986
+ name: MTEB T2Retrieval
2987
+ config: default
2988
+ split: dev
2989
+ revision: 8731a845f1bf500a4f111cf1070785c793d10e64
2990
+ metrics:
2991
+ - type: map_at_1
2992
+ value: 28.936
2993
+ - type: map_at_10
2994
+ value: 82.256
2995
+ - type: map_at_100
2996
+ value: 85.688
2997
+ - type: map_at_1000
2998
+ value: 85.727
2999
+ - type: map_at_3
3000
+ value: 57.655
3001
+ - type: map_at_5
3002
+ value: 71.05
3003
+ - type: mrr_at_1
3004
+ value: 92.548
3005
+ - type: mrr_at_10
3006
+ value: 94.586
3007
+ - type: mrr_at_100
3008
+ value: 94.64399999999999
3009
+ - type: mrr_at_1000
3010
+ value: 94.646
3011
+ - type: mrr_at_3
3012
+ value: 94.255
3013
+ - type: mrr_at_5
3014
+ value: 94.464
3015
+ - type: ndcg_at_1
3016
+ value: 92.548
3017
+ - type: ndcg_at_10
3018
+ value: 88.74600000000001
3019
+ - type: ndcg_at_100
3020
+ value: 91.58500000000001
3021
+ - type: ndcg_at_1000
3022
+ value: 91.953
3023
+ - type: ndcg_at_3
3024
+ value: 89.578
3025
+ - type: ndcg_at_5
3026
+ value: 88.584
3027
+ - type: precision_at_1
3028
+ value: 92.548
3029
+ - type: precision_at_10
3030
+ value: 43.954
3031
+ - type: precision_at_100
3032
+ value: 5.099
3033
+ - type: precision_at_1000
3034
+ value: 0.518
3035
+ - type: precision_at_3
3036
+ value: 78.213
3037
+ - type: precision_at_5
3038
+ value: 65.839
3039
+ - type: recall_at_1
3040
+ value: 28.936
3041
+ - type: recall_at_10
3042
+ value: 87.869
3043
+ - type: recall_at_100
3044
+ value: 97.286
3045
+ - type: recall_at_1000
3046
+ value: 99.173
3047
+ - type: recall_at_3
3048
+ value: 59.157000000000004
3049
+ - type: recall_at_5
3050
+ value: 74.02499999999999
3051
+ - task:
3052
+ type: Classification
3053
+ dataset:
3054
+ type: C-MTEB/TNews-classification
3055
+ name: MTEB TNews
3056
+ config: default
3057
+ split: validation
3058
+ revision: 317f262bf1e6126357bbe89e875451e4b0938fe4
3059
+ metrics:
3060
+ - type: accuracy
3061
+ value: 53.269
3062
+ - type: f1
3063
+ value: 50.68236445411186
3064
+ - task:
3065
+ type: Clustering
3066
+ dataset:
3067
+ type: C-MTEB/ThuNewsClusteringP2P
3068
+ name: MTEB ThuNewsClusteringP2P
3069
+ config: default
3070
+ split: test
3071
+ revision: 5798586b105c0434e4f0fe5e767abe619442cf93
3072
+ metrics:
3073
+ - type: v_measure
3074
+ value: 86.47994658950259
3075
+ - task:
3076
+ type: Clustering
3077
+ dataset:
3078
+ type: C-MTEB/ThuNewsClusteringS2S
3079
+ name: MTEB ThuNewsClusteringS2S
3080
+ config: default
3081
+ split: test
3082
+ revision: 8a8b2caeda43f39e13c4bc5bea0f8a667896e10d
3083
+ metrics:
3084
+ - type: v_measure
3085
+ value: 85.34791895793325
3086
+ - task:
3087
+ type: Retrieval
3088
+ dataset:
3089
+ type: C-MTEB/VideoRetrieval
3090
+ name: MTEB VideoRetrieval
3091
+ config: default
3092
+ split: dev
3093
+ revision: 58c2597a5943a2ba48f4668c3b90d796283c5639
3094
+ metrics:
3095
+ - type: map_at_1
3096
+ value: 65.5
3097
+ - type: map_at_10
3098
+ value: 74.134
3099
+ - type: map_at_100
3100
+ value: 74.49799999999999
3101
+ - type: map_at_1000
3102
+ value: 74.509
3103
+ - type: map_at_3
3104
+ value: 72.467
3105
+ - type: map_at_5
3106
+ value: 73.462
3107
+ - type: mrr_at_1
3108
+ value: 65.5
3109
+ - type: mrr_at_10
3110
+ value: 74.134
3111
+ - type: mrr_at_100
3112
+ value: 74.49799999999999
3113
+ - type: mrr_at_1000
3114
+ value: 74.509
3115
+ - type: mrr_at_3
3116
+ value: 72.467
3117
+ - type: mrr_at_5
3118
+ value: 73.462
3119
+ - type: ndcg_at_1
3120
+ value: 65.5
3121
+ - type: ndcg_at_10
3122
+ value: 78.144
3123
+ - type: ndcg_at_100
3124
+ value: 79.726
3125
+ - type: ndcg_at_1000
3126
+ value: 79.97800000000001
3127
+ - type: ndcg_at_3
3128
+ value: 74.735
3129
+ - type: ndcg_at_5
3130
+ value: 76.55999999999999
3131
+ - type: precision_at_1
3132
+ value: 65.5
3133
+ - type: precision_at_10
3134
+ value: 9.06
3135
+ - type: precision_at_100
3136
+ value: 0.976
3137
+ - type: precision_at_1000
3138
+ value: 0.1
3139
+ - type: precision_at_3
3140
+ value: 27.1
3141
+ - type: precision_at_5
3142
+ value: 17.16
3143
+ - type: recall_at_1
3144
+ value: 65.5
3145
+ - type: recall_at_10
3146
+ value: 90.60000000000001
3147
+ - type: recall_at_100
3148
+ value: 97.6
3149
+ - type: recall_at_1000
3150
+ value: 99.5
3151
+ - type: recall_at_3
3152
+ value: 81.3
3153
+ - type: recall_at_5
3154
+ value: 85.8
3155
+ - task:
3156
+ type: Classification
3157
+ dataset:
3158
+ type: C-MTEB/waimai-classification
3159
+ name: MTEB Waimai
3160
+ config: default
3161
+ split: test
3162
+ revision: 339287def212450dcaa9df8c22bf93e9980c7023
3163
+ metrics:
3164
+ - type: accuracy
3165
+ value: 89.43999999999998
3166
+ - type: ap
3167
+ value: 75.53653890653014
3168
+ - type: f1
3169
+ value: 87.91597334503136
3170
+ ---
3171
+
3172
+ ## gte-Qwen2-7B-instruct
3173
+
3174
+ **gte-Qwen2-7B-instruct** is the latest model in the gte (General Text Embedding) model family.
3175
+
3176
+ Recently, the [**Qwen team**](https://huggingface.co/Qwen) released the Qwen2 series of models, and we have trained the **gte-Qwen2-7B-instruct** model on top of the [Qwen2-7B](https://huggingface.co/Qwen/Qwen2-7B) LLM. Compared to the [gte-Qwen1.5-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) model, **gte-Qwen2-7B-instruct** uses the same training data and training strategies during the finetuning stage; the only difference is that the base model is upgraded to Qwen2-7B. Given the improvements of the Qwen2 series over the Qwen1.5 series, we can also expect consistent performance gains in the embedding models.
3177
+
3178
+ The model incorporates several key advancements:
3179
+
3180
+ - Integration of bidirectional attention mechanisms, enriching its contextual understanding.
3181
+ - Instruction tuning, applied solely on the query side for streamlined efficiency.
3182
+ - Comprehensive training across a vast, multilingual text corpus spanning diverse domains and scenarios. This training leverages both weakly supervised and supervised data, ensuring the model's applicability across numerous languages and a wide array of downstream tasks.
3183
+
3184
+
3185
+ ## Model Information
3186
+ - Model Size: 7B
3187
+ - Embedding Dimension: 4096
3188
+ - Max Input Tokens: 32k
3189
+
3190
+ ## Requirements
3191
+ ```
3192
+ transformers>=4.39.2
3193
+ flash_attn>=2.5.6
3194
+ ```
3195
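+
+ To set these up, a minimal install sketch (assuming pip; the exact command is not part of the model card, building `flash_attn` typically requires a CUDA toolchain, and `sentence-transformers` is only needed for the Sentence Transformers example below):
+
+ ```
+ pip install "transformers>=4.39.2" "flash_attn>=2.5.6" sentence-transformers
+ ```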
+ ## Usage
3196
+
3197
+ ### Sentence Transformers
3198
+
3199
+ ```python
3200
+ from sentence_transformers import SentenceTransformer
3201
+
3202
+ model = SentenceTransformer("Alibaba-NLP/gte-Qwen2-7B-instruct", trust_remote_code=True)
3203
+ # In case you want to reduce the maximum length:
3204
+ model.max_seq_length = 8192
3205
+
3206
+ queries = [
3207
+ "how much protein should a female eat",
3208
+ "summit define",
3209
+ ]
3210
+ documents = [
3211
+ "As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
3212
+ "Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments.",
3213
+ ]
3214
+
3215
+ query_embeddings = model.encode(queries, prompt_name="query")
3216
+ document_embeddings = model.encode(documents)
3217
+
3218
+ scores = (query_embeddings @ document_embeddings.T) * 100
3219
+ print(scores.tolist())
3220
+ ```
3221
+
3222
+ See [config_sentence_transformers.json](config_sentence_transformers.json) for all pre-built prompt names. Otherwise, you can use `model.encode(queries, prompt="Instruct: ...\nQuery: ")` to supply a custom prompt of your choice.
3223
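+
+ For example, a minimal sketch of a custom retrieval prompt (reusing `model` and `queries` from the snippet above; the instruction wording here is illustrative, not prescribed by the model card):
+
+ ```python
+ # Build an instructed prompt prefix; any one-sentence task description can be substituted
+ task = "Given a web search query, retrieve relevant passages that answer the query"
+ query_embeddings = model.encode(queries, prompt=f"Instruct: {task}\nQuery: ")
+ ```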
+
3224
+ ### Transformers
3225
+
3226
+ ```python
3227
+ import torch
3228
+ import torch.nn.functional as F
3229
+
3230
+ from torch import Tensor
3231
+ from transformers import AutoTokenizer, AutoModel
3232
+
3233
+
3234
+ def last_token_pool(last_hidden_states: Tensor,
3235
+ attention_mask: Tensor) -> Tensor:
3236
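+     # If the last position is attended to for every sequence (left padding or no padding), the last hidden state already corresponds to the final token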
+ left_padding = (attention_mask[:, -1].sum() == attention_mask.shape[0])
3237
+ if left_padding:
3238
+ return last_hidden_states[:, -1]
3239
+ else:
3240
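+         # Right padding: gather the hidden state at each sequence's last non-padding position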
+ sequence_lengths = attention_mask.sum(dim=1) - 1
3241
+ batch_size = last_hidden_states.shape[0]
3242
+ return last_hidden_states[torch.arange(batch_size, device=last_hidden_states.device), sequence_lengths]
3243
+
3244
+
3245
+ def get_detailed_instruct(task_description: str, query: str) -> str:
3246
+ return f'Instruct: {task_description}\nQuery: {query}'
3247
+
3248
+
3249
+ # Each query must come with a one-sentence instruction that describes the task
3250
+ task = 'Given a web search query, retrieve relevant passages that answer the query'
3251
+ queries = [
3252
+ get_detailed_instruct(task, 'how much protein should a female eat'),
3253
+ get_detailed_instruct(task, 'summit define')
3254
+ ]
3255
+ # No need to add instruction for retrieval documents
3256
+ documents = [
3257
+ "As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
3258
+ "Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."
3259
+ ]
3260
+ input_texts = queries + documents
3261
+
3262
+ tokenizer = AutoTokenizer.from_pretrained('Alibaba-NLP/gte-Qwen2-7B-instruct', trust_remote_code=True)
3263
+ model = AutoModel.from_pretrained('Alibaba-NLP/gte-Qwen2-7B-instruct', trust_remote_code=True)
3264
+
3265
+ max_length = 8192
3266
+
3267
+ # Tokenize the input texts
3268
+ batch_dict = tokenizer(input_texts, max_length=max_length, padding=True, truncation=True, return_tensors='pt')
3269
+ outputs = model(**batch_dict)
3270
+ embeddings = last_token_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
3271
+
3272
+ # normalize embeddings
3273
+ embeddings = F.normalize(embeddings, p=2, dim=1)
3274
+ scores = (embeddings[:2] @ embeddings[2:].T) * 100
3275
+ print(scores.tolist())
3276
+ ```
3277
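+
+ For a 7B model, loading in half precision on a GPU is usually necessary to keep memory manageable. A hedged sketch (assuming a CUDA device is available; `torch_dtype` is a standard `from_pretrained` argument, not something specific to this model):
+
+ ```python
+ # Optional: load the model on GPU in bfloat16 to reduce memory usage
+ model = AutoModel.from_pretrained(
+     'Alibaba-NLP/gte-Qwen2-7B-instruct',
+     trust_remote_code=True,
+     torch_dtype=torch.bfloat16,
+ ).to('cuda')
+ ```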
+
3278
+ ## Evaluation
3279
+
3280
+ ### MTEB & C-MTEB
3281
+
3282
+ You can use the [scripts/eval_mteb.py](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct/blob/main/scripts/eval_mteb.py) script to reproduce the following results of **gte-Qwen2-7B-instruct** on MTEB (English) / C-MTEB (Chinese):
3283
+
3284
+ | Model Name | MTEB(56) | C-MTEB(35) |
3285
+ |:----:|:---------:|:----------:|
3286
+ | [bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) | 64.23 | - |
3287
+ | [bge-large-en-v1.5](https://huggingface.co/BAAI/bge-large-en-v1.5) | 63.55 | - |
3288
+ | [gte-large-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-large-en-v1.5) | 65.39 | - |
3289
+ | [gte-base-en-v1.5](https://huggingface.co/Alibaba-NLP/gte-base-en-v1.5) | 64.11 | - |
3290
+ | [mxbai-embed-large-v1](https://huggingface.co/mixedbread-ai/mxbai-embed-large-v1) | 64.68 | - |
3291
+ | [acge_text_embedding](https://huggingface.co/aspire/acge_text_embedding) | - | 69.07 |
3292
+ | [stella-mrl-large-zh-v3.5-1792d](https://huggingface.co/infgrad/stella-mrl-large-zh-v3.5-1792d) | - | 68.55 |
3293
+ | [gte-large-zh](https://huggingface.co/thenlper/gte-large-zh) | - | 66.72 |
3294
+ | [multilingual-e5-base](https://huggingface.co/intfloat/multilingual-e5-base) | 59.45 | 56.21 |
3295
+ | [multilingual-e5-large](https://huggingface.co/intfloat/multilingual-e5-large) | 61.50 | 58.81 |
3296
+ | [e5-mistral-7b-instruct](https://huggingface.co/intfloat/e5-mistral-7b-instruct) | 66.63 | 60.81 |
3297
+ | [gte-Qwen1.5-7B-instruct](https://huggingface.co/Alibaba-NLP/gte-Qwen1.5-7B-instruct) | 67.34 | 69.52 |
3298
+ | [NV-Embed-v1](https://huggingface.co/nvidia/NV-Embed-v1) | 69.32 | - |
3299
+ | [**gte-Qwen2-7B-instruct**](https://huggingface.co/Alibaba-NLP/gte-Qwen2-7B-instruct) | **70.04** | **71.98** |
3300
+
3301
+ ### GTE Models
3302
+
3303
+ The gte series has consistently released two types of models: encoder-only models (based on the BERT architecture) and decoder-only models (based on the LLM architecture).
3304
+
3305
+ ## Citation
3306
+
3307
+ If you find our paper or models helpful, please consider citing:
3308
+
3309
+ ```
3310
+ @article{li2023towards,
3311
+ title={Towards general text embeddings with multi-stage contrastive learning},
3312
+ author={Li, Zehan and Zhang, Xin and Zhang, Yanzhao and Long, Dingkun and Xie, Pengjun and Zhang, Meishan},
3313
+ journal={arXiv preprint arXiv:2308.03281},
3314
+ year={2023}
3315
+ }
3316
+ ```