intfloat committed
Commit 30f8119
1 Parent(s): 6eaa730

upload models

README.md ADDED
@@ -0,0 +1,2656 @@
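The added README is YAML front matter recording MTEB benchmark results for e5-large across classification, clustering, reranking, retrieval, and STS tasks. For orientation only, the sketch below shows how scores of this kind are typically produced with the `mteb` package; the specific task name, output folder, and reliance on `sentence_transformers` are illustrative assumptions and not part of this commit.

```python
# Hypothetical sketch (not from this commit): evaluating an E5-style embedding
# model on a single MTEB task using the `mteb` and `sentence_transformers` packages.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("intfloat/e5-large")         # model under evaluation
evaluation = MTEB(tasks=["Banking77Classification"])     # one task from the results list
evaluation.run(model, output_folder="results/e5-large")  # writes per-task score files
```

Note that E5-style models are generally evaluated with "query: " and "passage: " prefixes added to the input texts, so a thin wrapper around the encoder may be needed to reproduce the reported numbers.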
1
+ ---
2
+ tags:
3
+ - mteb
4
+ model-index:
5
+ - name: e5-large
6
+ results:
7
+ - task:
8
+ type: Classification
9
+ dataset:
10
+ type: mteb/amazon_counterfactual
11
+ name: MTEB AmazonCounterfactualClassification (en)
12
+ config: en
13
+ split: test
14
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
15
+ metrics:
16
+ - type: accuracy
17
+ value: 77.68656716417911
18
+ - type: ap
19
+ value: 41.336896075573584
20
+ - type: f1
21
+ value: 71.788561468075
22
+ - task:
23
+ type: Classification
24
+ dataset:
25
+ type: mteb/amazon_polarity
26
+ name: MTEB AmazonPolarityClassification
27
+ config: default
28
+ split: test
29
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
30
+ metrics:
31
+ - type: accuracy
32
+ value: 90.04965
33
+ - type: ap
34
+ value: 86.24637009569418
35
+ - type: f1
36
+ value: 90.03896671762645
37
+ - task:
38
+ type: Classification
39
+ dataset:
40
+ type: mteb/amazon_reviews_multi
41
+ name: MTEB AmazonReviewsClassification (en)
42
+ config: en
43
+ split: test
44
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
45
+ metrics:
46
+ - type: accuracy
47
+ value: 43.016000000000005
48
+ - type: f1
49
+ value: 42.1942431880186
50
+ - task:
51
+ type: Retrieval
52
+ dataset:
53
+ type: arguana
54
+ name: MTEB ArguAna
55
+ config: default
56
+ split: test
57
+ revision: None
58
+ metrics:
59
+ - type: map_at_1
60
+ value: 25.107000000000003
61
+ - type: map_at_10
62
+ value: 40.464
63
+ - type: map_at_100
64
+ value: 41.577999999999996
65
+ - type: map_at_1000
66
+ value: 41.588
67
+ - type: map_at_3
68
+ value: 35.301
69
+ - type: map_at_5
70
+ value: 38.263000000000005
71
+ - type: mrr_at_1
72
+ value: 25.605
73
+ - type: mrr_at_10
74
+ value: 40.64
75
+ - type: mrr_at_100
76
+ value: 41.760000000000005
77
+ - type: mrr_at_1000
78
+ value: 41.77
79
+ - type: mrr_at_3
80
+ value: 35.443000000000005
81
+ - type: mrr_at_5
82
+ value: 38.448
83
+ - type: ndcg_at_1
84
+ value: 25.107000000000003
85
+ - type: ndcg_at_10
86
+ value: 49.352000000000004
87
+ - type: ndcg_at_100
88
+ value: 53.98500000000001
89
+ - type: ndcg_at_1000
90
+ value: 54.208
91
+ - type: ndcg_at_3
92
+ value: 38.671
93
+ - type: ndcg_at_5
94
+ value: 43.991
95
+ - type: precision_at_1
96
+ value: 25.107000000000003
97
+ - type: precision_at_10
98
+ value: 7.795000000000001
99
+ - type: precision_at_100
100
+ value: 0.979
101
+ - type: precision_at_1000
102
+ value: 0.1
103
+ - type: precision_at_3
104
+ value: 16.145
105
+ - type: precision_at_5
106
+ value: 12.262
107
+ - type: recall_at_1
108
+ value: 25.107000000000003
109
+ - type: recall_at_10
110
+ value: 77.952
111
+ - type: recall_at_100
112
+ value: 97.866
113
+ - type: recall_at_1000
114
+ value: 99.57300000000001
115
+ - type: recall_at_3
116
+ value: 48.435
117
+ - type: recall_at_5
118
+ value: 61.309000000000005
119
+ - task:
120
+ type: Clustering
121
+ dataset:
122
+ type: mteb/arxiv-clustering-p2p
123
+ name: MTEB ArxivClusteringP2P
124
+ config: default
125
+ split: test
126
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
127
+ metrics:
128
+ - type: v_measure
129
+ value: 46.19278045044154
130
+ - task:
131
+ type: Clustering
132
+ dataset:
133
+ type: mteb/arxiv-clustering-s2s
134
+ name: MTEB ArxivClusteringS2S
135
+ config: default
136
+ split: test
137
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
138
+ metrics:
139
+ - type: v_measure
140
+ value: 41.37976387757665
141
+ - task:
142
+ type: Reranking
143
+ dataset:
144
+ type: mteb/askubuntudupquestions-reranking
145
+ name: MTEB AskUbuntuDupQuestions
146
+ config: default
147
+ split: test
148
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
149
+ metrics:
150
+ - type: map
151
+ value: 60.07433334608074
152
+ - type: mrr
153
+ value: 73.44347711383723
154
+ - task:
155
+ type: STS
156
+ dataset:
157
+ type: mteb/biosses-sts
158
+ name: MTEB BIOSSES
159
+ config: default
160
+ split: test
161
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
162
+ metrics:
163
+ - type: cos_sim_pearson
164
+ value: 86.4298072183543
165
+ - type: cos_sim_spearman
166
+ value: 84.73144873582848
167
+ - type: euclidean_pearson
168
+ value: 85.15885058870728
169
+ - type: euclidean_spearman
170
+ value: 85.42062106559356
171
+ - type: manhattan_pearson
172
+ value: 84.89409921792054
173
+ - type: manhattan_spearman
174
+ value: 85.31941394024344
175
+ - task:
176
+ type: Classification
177
+ dataset:
178
+ type: mteb/banking77
179
+ name: MTEB Banking77Classification
180
+ config: default
181
+ split: test
182
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
183
+ metrics:
184
+ - type: accuracy
185
+ value: 84.14285714285714
186
+ - type: f1
187
+ value: 84.11674412565644
188
+ - task:
189
+ type: Clustering
190
+ dataset:
191
+ type: mteb/biorxiv-clustering-p2p
192
+ name: MTEB BiorxivClusteringP2P
193
+ config: default
194
+ split: test
195
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
196
+ metrics:
197
+ - type: v_measure
198
+ value: 37.600076342340785
199
+ - task:
200
+ type: Clustering
201
+ dataset:
202
+ type: mteb/biorxiv-clustering-s2s
203
+ name: MTEB BiorxivClusteringS2S
204
+ config: default
205
+ split: test
206
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
207
+ metrics:
208
+ - type: v_measure
209
+ value: 35.08861812135148
210
+ - task:
211
+ type: Retrieval
212
+ dataset:
213
+ type: BeIR/cqadupstack
214
+ name: MTEB CQADupstackAndroidRetrieval
215
+ config: default
216
+ split: test
217
+ revision: None
218
+ metrics:
219
+ - type: map_at_1
220
+ value: 32.684000000000005
221
+ - type: map_at_10
222
+ value: 41.675000000000004
223
+ - type: map_at_100
224
+ value: 42.963
225
+ - type: map_at_1000
226
+ value: 43.078
227
+ - type: map_at_3
228
+ value: 38.708999999999996
229
+ - type: map_at_5
230
+ value: 40.316
231
+ - type: mrr_at_1
232
+ value: 39.485
233
+ - type: mrr_at_10
234
+ value: 47.152
235
+ - type: mrr_at_100
236
+ value: 47.96
237
+ - type: mrr_at_1000
238
+ value: 48.010000000000005
239
+ - type: mrr_at_3
240
+ value: 44.754
241
+ - type: mrr_at_5
242
+ value: 46.285
243
+ - type: ndcg_at_1
244
+ value: 39.485
245
+ - type: ndcg_at_10
246
+ value: 46.849000000000004
247
+ - type: ndcg_at_100
248
+ value: 52.059
249
+ - type: ndcg_at_1000
250
+ value: 54.358
251
+ - type: ndcg_at_3
252
+ value: 42.705
253
+ - type: ndcg_at_5
254
+ value: 44.663000000000004
255
+ - type: precision_at_1
256
+ value: 39.485
257
+ - type: precision_at_10
258
+ value: 8.455
259
+ - type: precision_at_100
260
+ value: 1.3379999999999999
261
+ - type: precision_at_1000
262
+ value: 0.178
263
+ - type: precision_at_3
264
+ value: 19.695
265
+ - type: precision_at_5
266
+ value: 13.905999999999999
267
+ - type: recall_at_1
268
+ value: 32.684000000000005
269
+ - type: recall_at_10
270
+ value: 56.227000000000004
271
+ - type: recall_at_100
272
+ value: 78.499
273
+ - type: recall_at_1000
274
+ value: 94.021
275
+ - type: recall_at_3
276
+ value: 44.157999999999994
277
+ - type: recall_at_5
278
+ value: 49.694
279
+ - task:
280
+ type: Retrieval
281
+ dataset:
282
+ type: BeIR/cqadupstack
283
+ name: MTEB CQADupstackEnglishRetrieval
284
+ config: default
285
+ split: test
286
+ revision: None
287
+ metrics:
288
+ - type: map_at_1
289
+ value: 31.875999999999998
290
+ - type: map_at_10
291
+ value: 41.603
292
+ - type: map_at_100
293
+ value: 42.825
294
+ - type: map_at_1000
295
+ value: 42.961
296
+ - type: map_at_3
297
+ value: 38.655
298
+ - type: map_at_5
299
+ value: 40.294999999999995
300
+ - type: mrr_at_1
301
+ value: 40.127
302
+ - type: mrr_at_10
303
+ value: 47.959
304
+ - type: mrr_at_100
305
+ value: 48.59
306
+ - type: mrr_at_1000
307
+ value: 48.634
308
+ - type: mrr_at_3
309
+ value: 45.786
310
+ - type: mrr_at_5
311
+ value: 46.964
312
+ - type: ndcg_at_1
313
+ value: 40.127
314
+ - type: ndcg_at_10
315
+ value: 47.176
316
+ - type: ndcg_at_100
317
+ value: 51.346000000000004
318
+ - type: ndcg_at_1000
319
+ value: 53.502
320
+ - type: ndcg_at_3
321
+ value: 43.139
322
+ - type: ndcg_at_5
323
+ value: 44.883
324
+ - type: precision_at_1
325
+ value: 40.127
326
+ - type: precision_at_10
327
+ value: 8.72
328
+ - type: precision_at_100
329
+ value: 1.387
330
+ - type: precision_at_1000
331
+ value: 0.188
332
+ - type: precision_at_3
333
+ value: 20.637
334
+ - type: precision_at_5
335
+ value: 14.446
336
+ - type: recall_at_1
337
+ value: 31.875999999999998
338
+ - type: recall_at_10
339
+ value: 56.54900000000001
340
+ - type: recall_at_100
341
+ value: 73.939
342
+ - type: recall_at_1000
343
+ value: 87.732
344
+ - type: recall_at_3
345
+ value: 44.326
346
+ - type: recall_at_5
347
+ value: 49.445
348
+ - task:
349
+ type: Retrieval
350
+ dataset:
351
+ type: BeIR/cqadupstack
352
+ name: MTEB CQADupstackGamingRetrieval
353
+ config: default
354
+ split: test
355
+ revision: None
356
+ metrics:
357
+ - type: map_at_1
358
+ value: 41.677
359
+ - type: map_at_10
360
+ value: 52.222
361
+ - type: map_at_100
362
+ value: 53.229000000000006
363
+ - type: map_at_1000
364
+ value: 53.288000000000004
365
+ - type: map_at_3
366
+ value: 49.201
367
+ - type: map_at_5
368
+ value: 51.00599999999999
369
+ - type: mrr_at_1
370
+ value: 47.524
371
+ - type: mrr_at_10
372
+ value: 55.745999999999995
373
+ - type: mrr_at_100
374
+ value: 56.433
375
+ - type: mrr_at_1000
376
+ value: 56.464999999999996
377
+ - type: mrr_at_3
378
+ value: 53.37499999999999
379
+ - type: mrr_at_5
380
+ value: 54.858
381
+ - type: ndcg_at_1
382
+ value: 47.524
383
+ - type: ndcg_at_10
384
+ value: 57.406
385
+ - type: ndcg_at_100
386
+ value: 61.403
387
+ - type: ndcg_at_1000
388
+ value: 62.7
389
+ - type: ndcg_at_3
390
+ value: 52.298
391
+ - type: ndcg_at_5
392
+ value: 55.02
393
+ - type: precision_at_1
394
+ value: 47.524
395
+ - type: precision_at_10
396
+ value: 8.865
397
+ - type: precision_at_100
398
+ value: 1.179
399
+ - type: precision_at_1000
400
+ value: 0.134
401
+ - type: precision_at_3
402
+ value: 22.612
403
+ - type: precision_at_5
404
+ value: 15.461
405
+ - type: recall_at_1
406
+ value: 41.677
407
+ - type: recall_at_10
408
+ value: 69.346
409
+ - type: recall_at_100
410
+ value: 86.344
411
+ - type: recall_at_1000
412
+ value: 95.703
413
+ - type: recall_at_3
414
+ value: 55.789
415
+ - type: recall_at_5
416
+ value: 62.488
417
+ - task:
418
+ type: Retrieval
419
+ dataset:
420
+ type: BeIR/cqadupstack
421
+ name: MTEB CQADupstackGisRetrieval
422
+ config: default
423
+ split: test
424
+ revision: None
425
+ metrics:
426
+ - type: map_at_1
427
+ value: 25.991999999999997
428
+ - type: map_at_10
429
+ value: 32.804
430
+ - type: map_at_100
431
+ value: 33.812999999999995
432
+ - type: map_at_1000
433
+ value: 33.897
434
+ - type: map_at_3
435
+ value: 30.567
436
+ - type: map_at_5
437
+ value: 31.599
438
+ - type: mrr_at_1
439
+ value: 27.797
440
+ - type: mrr_at_10
441
+ value: 34.768
442
+ - type: mrr_at_100
443
+ value: 35.702
444
+ - type: mrr_at_1000
445
+ value: 35.766
446
+ - type: mrr_at_3
447
+ value: 32.637
448
+ - type: mrr_at_5
449
+ value: 33.614
450
+ - type: ndcg_at_1
451
+ value: 27.797
452
+ - type: ndcg_at_10
453
+ value: 36.966
454
+ - type: ndcg_at_100
455
+ value: 41.972
456
+ - type: ndcg_at_1000
457
+ value: 44.139
458
+ - type: ndcg_at_3
459
+ value: 32.547
460
+ - type: ndcg_at_5
461
+ value: 34.258
462
+ - type: precision_at_1
463
+ value: 27.797
464
+ - type: precision_at_10
465
+ value: 5.514
466
+ - type: precision_at_100
467
+ value: 0.8340000000000001
468
+ - type: precision_at_1000
469
+ value: 0.106
470
+ - type: precision_at_3
471
+ value: 13.333
472
+ - type: precision_at_5
473
+ value: 9.04
474
+ - type: recall_at_1
475
+ value: 25.991999999999997
476
+ - type: recall_at_10
477
+ value: 47.941
478
+ - type: recall_at_100
479
+ value: 71.039
480
+ - type: recall_at_1000
481
+ value: 87.32799999999999
482
+ - type: recall_at_3
483
+ value: 36.01
484
+ - type: recall_at_5
485
+ value: 40.056000000000004
486
+ - task:
487
+ type: Retrieval
488
+ dataset:
489
+ type: BeIR/cqadupstack
490
+ name: MTEB CQADupstackMathematicaRetrieval
491
+ config: default
492
+ split: test
493
+ revision: None
494
+ metrics:
495
+ - type: map_at_1
496
+ value: 17.533
497
+ - type: map_at_10
498
+ value: 24.336
499
+ - type: map_at_100
500
+ value: 25.445
501
+ - type: map_at_1000
502
+ value: 25.561
503
+ - type: map_at_3
504
+ value: 22.116
505
+ - type: map_at_5
506
+ value: 23.347
507
+ - type: mrr_at_1
508
+ value: 21.642
509
+ - type: mrr_at_10
510
+ value: 28.910999999999998
511
+ - type: mrr_at_100
512
+ value: 29.836000000000002
513
+ - type: mrr_at_1000
514
+ value: 29.907
515
+ - type: mrr_at_3
516
+ value: 26.638
517
+ - type: mrr_at_5
518
+ value: 27.857
519
+ - type: ndcg_at_1
520
+ value: 21.642
521
+ - type: ndcg_at_10
522
+ value: 28.949
523
+ - type: ndcg_at_100
524
+ value: 34.211000000000006
525
+ - type: ndcg_at_1000
526
+ value: 37.031
527
+ - type: ndcg_at_3
528
+ value: 24.788
529
+ - type: ndcg_at_5
530
+ value: 26.685
531
+ - type: precision_at_1
532
+ value: 21.642
533
+ - type: precision_at_10
534
+ value: 5.137
535
+ - type: precision_at_100
536
+ value: 0.893
537
+ - type: precision_at_1000
538
+ value: 0.127
539
+ - type: precision_at_3
540
+ value: 11.733
541
+ - type: precision_at_5
542
+ value: 8.383000000000001
543
+ - type: recall_at_1
544
+ value: 17.533
545
+ - type: recall_at_10
546
+ value: 38.839
547
+ - type: recall_at_100
548
+ value: 61.458999999999996
549
+ - type: recall_at_1000
550
+ value: 81.58
551
+ - type: recall_at_3
552
+ value: 27.328999999999997
553
+ - type: recall_at_5
554
+ value: 32.168
555
+ - task:
556
+ type: Retrieval
557
+ dataset:
558
+ type: BeIR/cqadupstack
559
+ name: MTEB CQADupstackPhysicsRetrieval
560
+ config: default
561
+ split: test
562
+ revision: None
563
+ metrics:
564
+ - type: map_at_1
565
+ value: 28.126
566
+ - type: map_at_10
567
+ value: 37.872
568
+ - type: map_at_100
569
+ value: 39.229
570
+ - type: map_at_1000
571
+ value: 39.353
572
+ - type: map_at_3
573
+ value: 34.93
574
+ - type: map_at_5
575
+ value: 36.59
576
+ - type: mrr_at_1
577
+ value: 34.071
578
+ - type: mrr_at_10
579
+ value: 43.056
580
+ - type: mrr_at_100
581
+ value: 43.944
582
+ - type: mrr_at_1000
583
+ value: 43.999
584
+ - type: mrr_at_3
585
+ value: 40.536
586
+ - type: mrr_at_5
587
+ value: 42.065999999999995
588
+ - type: ndcg_at_1
589
+ value: 34.071
590
+ - type: ndcg_at_10
591
+ value: 43.503
592
+ - type: ndcg_at_100
593
+ value: 49.120000000000005
594
+ - type: ndcg_at_1000
595
+ value: 51.410999999999994
596
+ - type: ndcg_at_3
597
+ value: 38.767
598
+ - type: ndcg_at_5
599
+ value: 41.075
600
+ - type: precision_at_1
601
+ value: 34.071
602
+ - type: precision_at_10
603
+ value: 7.843999999999999
604
+ - type: precision_at_100
605
+ value: 1.2489999999999999
606
+ - type: precision_at_1000
607
+ value: 0.163
608
+ - type: precision_at_3
609
+ value: 18.223
610
+ - type: precision_at_5
611
+ value: 13.050999999999998
612
+ - type: recall_at_1
613
+ value: 28.126
614
+ - type: recall_at_10
615
+ value: 54.952
616
+ - type: recall_at_100
617
+ value: 78.375
618
+ - type: recall_at_1000
619
+ value: 93.29899999999999
620
+ - type: recall_at_3
621
+ value: 41.714
622
+ - type: recall_at_5
623
+ value: 47.635
624
+ - task:
625
+ type: Retrieval
626
+ dataset:
627
+ type: BeIR/cqadupstack
628
+ name: MTEB CQADupstackProgrammersRetrieval
629
+ config: default
630
+ split: test
631
+ revision: None
632
+ metrics:
633
+ - type: map_at_1
634
+ value: 25.957
635
+ - type: map_at_10
636
+ value: 34.749
637
+ - type: map_at_100
638
+ value: 35.929
639
+ - type: map_at_1000
640
+ value: 36.043
641
+ - type: map_at_3
642
+ value: 31.947
643
+ - type: map_at_5
644
+ value: 33.575
645
+ - type: mrr_at_1
646
+ value: 32.078
647
+ - type: mrr_at_10
648
+ value: 39.844
649
+ - type: mrr_at_100
650
+ value: 40.71
651
+ - type: mrr_at_1000
652
+ value: 40.77
653
+ - type: mrr_at_3
654
+ value: 37.386
655
+ - type: mrr_at_5
656
+ value: 38.83
657
+ - type: ndcg_at_1
658
+ value: 32.078
659
+ - type: ndcg_at_10
660
+ value: 39.97
661
+ - type: ndcg_at_100
662
+ value: 45.254
663
+ - type: ndcg_at_1000
664
+ value: 47.818
665
+ - type: ndcg_at_3
666
+ value: 35.453
667
+ - type: ndcg_at_5
668
+ value: 37.631
669
+ - type: precision_at_1
670
+ value: 32.078
671
+ - type: precision_at_10
672
+ value: 7.158
673
+ - type: precision_at_100
674
+ value: 1.126
675
+ - type: precision_at_1000
676
+ value: 0.153
677
+ - type: precision_at_3
678
+ value: 16.743
679
+ - type: precision_at_5
680
+ value: 11.872
681
+ - type: recall_at_1
682
+ value: 25.957
683
+ - type: recall_at_10
684
+ value: 50.583
685
+ - type: recall_at_100
686
+ value: 73.593
687
+ - type: recall_at_1000
688
+ value: 91.23599999999999
689
+ - type: recall_at_3
690
+ value: 37.651
691
+ - type: recall_at_5
692
+ value: 43.626
693
+ - task:
694
+ type: Retrieval
695
+ dataset:
696
+ type: BeIR/cqadupstack
697
+ name: MTEB CQADupstackRetrieval
698
+ config: default
699
+ split: test
700
+ revision: None
701
+ metrics:
702
+ - type: map_at_1
703
+ value: 27.1505
704
+ - type: map_at_10
705
+ value: 34.844833333333334
706
+ - type: map_at_100
707
+ value: 35.95216666666667
708
+ - type: map_at_1000
709
+ value: 36.06675
710
+ - type: map_at_3
711
+ value: 32.41975
712
+ - type: map_at_5
713
+ value: 33.74233333333333
714
+ - type: mrr_at_1
715
+ value: 31.923666666666662
716
+ - type: mrr_at_10
717
+ value: 38.87983333333334
718
+ - type: mrr_at_100
719
+ value: 39.706250000000004
720
+ - type: mrr_at_1000
721
+ value: 39.76708333333333
722
+ - type: mrr_at_3
723
+ value: 36.72008333333333
724
+ - type: mrr_at_5
725
+ value: 37.96933333333334
726
+ - type: ndcg_at_1
727
+ value: 31.923666666666662
728
+ - type: ndcg_at_10
729
+ value: 39.44258333333334
730
+ - type: ndcg_at_100
731
+ value: 44.31475
732
+ - type: ndcg_at_1000
733
+ value: 46.75
734
+ - type: ndcg_at_3
735
+ value: 35.36299999999999
736
+ - type: ndcg_at_5
737
+ value: 37.242333333333335
738
+ - type: precision_at_1
739
+ value: 31.923666666666662
740
+ - type: precision_at_10
741
+ value: 6.643333333333333
742
+ - type: precision_at_100
743
+ value: 1.0612499999999998
744
+ - type: precision_at_1000
745
+ value: 0.14575
746
+ - type: precision_at_3
747
+ value: 15.875250000000001
748
+ - type: precision_at_5
749
+ value: 11.088916666666664
750
+ - type: recall_at_1
751
+ value: 27.1505
752
+ - type: recall_at_10
753
+ value: 49.06349999999999
754
+ - type: recall_at_100
755
+ value: 70.60841666666666
756
+ - type: recall_at_1000
757
+ value: 87.72049999999999
758
+ - type: recall_at_3
759
+ value: 37.60575000000001
760
+ - type: recall_at_5
761
+ value: 42.511166666666675
762
+ - task:
763
+ type: Retrieval
764
+ dataset:
765
+ type: BeIR/cqadupstack
766
+ name: MTEB CQADupstackStatsRetrieval
767
+ config: default
768
+ split: test
769
+ revision: None
770
+ metrics:
771
+ - type: map_at_1
772
+ value: 25.101000000000003
773
+ - type: map_at_10
774
+ value: 30.147000000000002
775
+ - type: map_at_100
776
+ value: 30.98
777
+ - type: map_at_1000
778
+ value: 31.080000000000002
779
+ - type: map_at_3
780
+ value: 28.571
781
+ - type: map_at_5
782
+ value: 29.319
783
+ - type: mrr_at_1
784
+ value: 27.761000000000003
785
+ - type: mrr_at_10
786
+ value: 32.716
787
+ - type: mrr_at_100
788
+ value: 33.504
789
+ - type: mrr_at_1000
790
+ value: 33.574
791
+ - type: mrr_at_3
792
+ value: 31.135
793
+ - type: mrr_at_5
794
+ value: 32.032
795
+ - type: ndcg_at_1
796
+ value: 27.761000000000003
797
+ - type: ndcg_at_10
798
+ value: 33.358
799
+ - type: ndcg_at_100
800
+ value: 37.569
801
+ - type: ndcg_at_1000
802
+ value: 40.189
803
+ - type: ndcg_at_3
804
+ value: 30.291
805
+ - type: ndcg_at_5
806
+ value: 31.558000000000003
807
+ - type: precision_at_1
808
+ value: 27.761000000000003
809
+ - type: precision_at_10
810
+ value: 4.939
811
+ - type: precision_at_100
812
+ value: 0.759
813
+ - type: precision_at_1000
814
+ value: 0.106
815
+ - type: precision_at_3
816
+ value: 12.577
817
+ - type: precision_at_5
818
+ value: 8.497
819
+ - type: recall_at_1
820
+ value: 25.101000000000003
821
+ - type: recall_at_10
822
+ value: 40.739
823
+ - type: recall_at_100
824
+ value: 60.089999999999996
825
+ - type: recall_at_1000
826
+ value: 79.768
827
+ - type: recall_at_3
828
+ value: 32.16
829
+ - type: recall_at_5
830
+ value: 35.131
831
+ - task:
832
+ type: Retrieval
833
+ dataset:
834
+ type: BeIR/cqadupstack
835
+ name: MTEB CQADupstackTexRetrieval
836
+ config: default
837
+ split: test
838
+ revision: None
839
+ metrics:
840
+ - type: map_at_1
841
+ value: 20.112
842
+ - type: map_at_10
843
+ value: 26.119999999999997
844
+ - type: map_at_100
845
+ value: 27.031
846
+ - type: map_at_1000
847
+ value: 27.150000000000002
848
+ - type: map_at_3
849
+ value: 24.230999999999998
850
+ - type: map_at_5
851
+ value: 25.15
852
+ - type: mrr_at_1
853
+ value: 24.535
854
+ - type: mrr_at_10
855
+ value: 30.198000000000004
856
+ - type: mrr_at_100
857
+ value: 30.975
858
+ - type: mrr_at_1000
859
+ value: 31.051000000000002
860
+ - type: mrr_at_3
861
+ value: 28.338
862
+ - type: mrr_at_5
863
+ value: 29.269000000000002
864
+ - type: ndcg_at_1
865
+ value: 24.535
866
+ - type: ndcg_at_10
867
+ value: 30.147000000000002
868
+ - type: ndcg_at_100
869
+ value: 34.544000000000004
870
+ - type: ndcg_at_1000
871
+ value: 37.512
872
+ - type: ndcg_at_3
873
+ value: 26.726
874
+ - type: ndcg_at_5
875
+ value: 28.046
876
+ - type: precision_at_1
877
+ value: 24.535
878
+ - type: precision_at_10
879
+ value: 5.179
880
+ - type: precision_at_100
881
+ value: 0.859
882
+ - type: precision_at_1000
883
+ value: 0.128
884
+ - type: precision_at_3
885
+ value: 12.159
886
+ - type: precision_at_5
887
+ value: 8.424
888
+ - type: recall_at_1
889
+ value: 20.112
890
+ - type: recall_at_10
891
+ value: 38.312000000000005
892
+ - type: recall_at_100
893
+ value: 58.406000000000006
894
+ - type: recall_at_1000
895
+ value: 79.863
896
+ - type: recall_at_3
897
+ value: 28.358
898
+ - type: recall_at_5
899
+ value: 31.973000000000003
900
+ - task:
901
+ type: Retrieval
902
+ dataset:
903
+ type: BeIR/cqadupstack
904
+ name: MTEB CQADupstackUnixRetrieval
905
+ config: default
906
+ split: test
907
+ revision: None
908
+ metrics:
909
+ - type: map_at_1
910
+ value: 27.111
911
+ - type: map_at_10
912
+ value: 34.096
913
+ - type: map_at_100
914
+ value: 35.181000000000004
915
+ - type: map_at_1000
916
+ value: 35.276
917
+ - type: map_at_3
918
+ value: 31.745
919
+ - type: map_at_5
920
+ value: 33.045
921
+ - type: mrr_at_1
922
+ value: 31.343
923
+ - type: mrr_at_10
924
+ value: 37.994
925
+ - type: mrr_at_100
926
+ value: 38.873000000000005
927
+ - type: mrr_at_1000
928
+ value: 38.934999999999995
929
+ - type: mrr_at_3
930
+ value: 35.743
931
+ - type: mrr_at_5
932
+ value: 37.077
933
+ - type: ndcg_at_1
934
+ value: 31.343
935
+ - type: ndcg_at_10
936
+ value: 38.572
937
+ - type: ndcg_at_100
938
+ value: 43.854
939
+ - type: ndcg_at_1000
940
+ value: 46.190999999999995
941
+ - type: ndcg_at_3
942
+ value: 34.247
943
+ - type: ndcg_at_5
944
+ value: 36.28
945
+ - type: precision_at_1
946
+ value: 31.343
947
+ - type: precision_at_10
948
+ value: 6.166
949
+ - type: precision_at_100
950
+ value: 1.0
951
+ - type: precision_at_1000
952
+ value: 0.13
953
+ - type: precision_at_3
954
+ value: 15.081
955
+ - type: precision_at_5
956
+ value: 10.428999999999998
957
+ - type: recall_at_1
958
+ value: 27.111
959
+ - type: recall_at_10
960
+ value: 48.422
961
+ - type: recall_at_100
962
+ value: 71.846
963
+ - type: recall_at_1000
964
+ value: 88.57000000000001
965
+ - type: recall_at_3
966
+ value: 36.435
967
+ - type: recall_at_5
968
+ value: 41.765
969
+ - task:
970
+ type: Retrieval
971
+ dataset:
972
+ type: BeIR/cqadupstack
973
+ name: MTEB CQADupstackWebmastersRetrieval
974
+ config: default
975
+ split: test
976
+ revision: None
977
+ metrics:
978
+ - type: map_at_1
979
+ value: 26.264
980
+ - type: map_at_10
981
+ value: 33.522
982
+ - type: map_at_100
983
+ value: 34.963
984
+ - type: map_at_1000
985
+ value: 35.175
986
+ - type: map_at_3
987
+ value: 31.366
988
+ - type: map_at_5
989
+ value: 32.621
990
+ - type: mrr_at_1
991
+ value: 31.028
992
+ - type: mrr_at_10
993
+ value: 37.230000000000004
994
+ - type: mrr_at_100
995
+ value: 38.149
996
+ - type: mrr_at_1000
997
+ value: 38.218
998
+ - type: mrr_at_3
999
+ value: 35.046
1000
+ - type: mrr_at_5
1001
+ value: 36.617
1002
+ - type: ndcg_at_1
1003
+ value: 31.028
1004
+ - type: ndcg_at_10
1005
+ value: 37.964999999999996
1006
+ - type: ndcg_at_100
1007
+ value: 43.342000000000006
1008
+ - type: ndcg_at_1000
1009
+ value: 46.471000000000004
1010
+ - type: ndcg_at_3
1011
+ value: 34.67
1012
+ - type: ndcg_at_5
1013
+ value: 36.458
1014
+ - type: precision_at_1
1015
+ value: 31.028
1016
+ - type: precision_at_10
1017
+ value: 6.937
1018
+ - type: precision_at_100
1019
+ value: 1.346
1020
+ - type: precision_at_1000
1021
+ value: 0.22799999999999998
1022
+ - type: precision_at_3
1023
+ value: 15.942
1024
+ - type: precision_at_5
1025
+ value: 11.462
1026
+ - type: recall_at_1
1027
+ value: 26.264
1028
+ - type: recall_at_10
1029
+ value: 45.571
1030
+ - type: recall_at_100
1031
+ value: 70.246
1032
+ - type: recall_at_1000
1033
+ value: 90.971
1034
+ - type: recall_at_3
1035
+ value: 36.276
1036
+ - type: recall_at_5
1037
+ value: 41.162
1038
+ - task:
1039
+ type: Retrieval
1040
+ dataset:
1041
+ type: BeIR/cqadupstack
1042
+ name: MTEB CQADupstackWordpressRetrieval
1043
+ config: default
1044
+ split: test
1045
+ revision: None
1046
+ metrics:
1047
+ - type: map_at_1
1048
+ value: 23.372999999999998
1049
+ - type: map_at_10
1050
+ value: 28.992
1051
+ - type: map_at_100
1052
+ value: 29.837999999999997
1053
+ - type: map_at_1000
1054
+ value: 29.939
1055
+ - type: map_at_3
1056
+ value: 26.999000000000002
1057
+ - type: map_at_5
1058
+ value: 28.044999999999998
1059
+ - type: mrr_at_1
1060
+ value: 25.692999999999998
1061
+ - type: mrr_at_10
1062
+ value: 30.984
1063
+ - type: mrr_at_100
1064
+ value: 31.799
1065
+ - type: mrr_at_1000
1066
+ value: 31.875999999999998
1067
+ - type: mrr_at_3
1068
+ value: 29.267
1069
+ - type: mrr_at_5
1070
+ value: 30.163
1071
+ - type: ndcg_at_1
1072
+ value: 25.692999999999998
1073
+ - type: ndcg_at_10
1074
+ value: 32.45
1075
+ - type: ndcg_at_100
1076
+ value: 37.103
1077
+ - type: ndcg_at_1000
1078
+ value: 39.678000000000004
1079
+ - type: ndcg_at_3
1080
+ value: 28.725
1081
+ - type: ndcg_at_5
1082
+ value: 30.351
1083
+ - type: precision_at_1
1084
+ value: 25.692999999999998
1085
+ - type: precision_at_10
1086
+ value: 4.806
1087
+ - type: precision_at_100
1088
+ value: 0.765
1089
+ - type: precision_at_1000
1090
+ value: 0.108
1091
+ - type: precision_at_3
1092
+ value: 11.768
1093
+ - type: precision_at_5
1094
+ value: 8.096
1095
+ - type: recall_at_1
1096
+ value: 23.372999999999998
1097
+ - type: recall_at_10
1098
+ value: 41.281
1099
+ - type: recall_at_100
1100
+ value: 63.465
1101
+ - type: recall_at_1000
1102
+ value: 82.575
1103
+ - type: recall_at_3
1104
+ value: 31.063000000000002
1105
+ - type: recall_at_5
1106
+ value: 34.991
1107
+ - task:
1108
+ type: Retrieval
1109
+ dataset:
1110
+ type: climate-fever
1111
+ name: MTEB ClimateFEVER
1112
+ config: default
1113
+ split: test
1114
+ revision: None
1115
+ metrics:
1116
+ - type: map_at_1
1117
+ value: 8.821
1118
+ - type: map_at_10
1119
+ value: 15.383
1120
+ - type: map_at_100
1121
+ value: 17.244999999999997
1122
+ - type: map_at_1000
1123
+ value: 17.445
1124
+ - type: map_at_3
1125
+ value: 12.64
1126
+ - type: map_at_5
1127
+ value: 13.941999999999998
1128
+ - type: mrr_at_1
1129
+ value: 19.544
1130
+ - type: mrr_at_10
1131
+ value: 29.738999999999997
1132
+ - type: mrr_at_100
1133
+ value: 30.923000000000002
1134
+ - type: mrr_at_1000
1135
+ value: 30.969
1136
+ - type: mrr_at_3
1137
+ value: 26.384
1138
+ - type: mrr_at_5
1139
+ value: 28.199
1140
+ - type: ndcg_at_1
1141
+ value: 19.544
1142
+ - type: ndcg_at_10
1143
+ value: 22.398
1144
+ - type: ndcg_at_100
1145
+ value: 30.253999999999998
1146
+ - type: ndcg_at_1000
1147
+ value: 33.876
1148
+ - type: ndcg_at_3
1149
+ value: 17.473
1150
+ - type: ndcg_at_5
1151
+ value: 19.154
1152
+ - type: precision_at_1
1153
+ value: 19.544
1154
+ - type: precision_at_10
1155
+ value: 7.217999999999999
1156
+ - type: precision_at_100
1157
+ value: 1.564
1158
+ - type: precision_at_1000
1159
+ value: 0.22300000000000003
1160
+ - type: precision_at_3
1161
+ value: 13.225000000000001
1162
+ - type: precision_at_5
1163
+ value: 10.319
1164
+ - type: recall_at_1
1165
+ value: 8.821
1166
+ - type: recall_at_10
1167
+ value: 28.110000000000003
1168
+ - type: recall_at_100
1169
+ value: 55.64
1170
+ - type: recall_at_1000
1171
+ value: 75.964
1172
+ - type: recall_at_3
1173
+ value: 16.195
1174
+ - type: recall_at_5
1175
+ value: 20.678
1176
+ - task:
1177
+ type: Retrieval
1178
+ dataset:
1179
+ type: dbpedia-entity
1180
+ name: MTEB DBPedia
1181
+ config: default
1182
+ split: test
1183
+ revision: None
1184
+ metrics:
1185
+ - type: map_at_1
1186
+ value: 9.344
1187
+ - type: map_at_10
1188
+ value: 20.301
1189
+ - type: map_at_100
1190
+ value: 28.709
1191
+ - type: map_at_1000
1192
+ value: 30.470999999999997
1193
+ - type: map_at_3
1194
+ value: 14.584
1195
+ - type: map_at_5
1196
+ value: 16.930999999999997
1197
+ - type: mrr_at_1
1198
+ value: 67.25
1199
+ - type: mrr_at_10
1200
+ value: 75.393
1201
+ - type: mrr_at_100
1202
+ value: 75.742
1203
+ - type: mrr_at_1000
1204
+ value: 75.75
1205
+ - type: mrr_at_3
1206
+ value: 73.958
1207
+ - type: mrr_at_5
1208
+ value: 74.883
1209
+ - type: ndcg_at_1
1210
+ value: 56.00000000000001
1211
+ - type: ndcg_at_10
1212
+ value: 42.394
1213
+ - type: ndcg_at_100
1214
+ value: 47.091
1215
+ - type: ndcg_at_1000
1216
+ value: 54.215
1217
+ - type: ndcg_at_3
1218
+ value: 46.995
1219
+ - type: ndcg_at_5
1220
+ value: 44.214999999999996
1221
+ - type: precision_at_1
1222
+ value: 67.25
1223
+ - type: precision_at_10
1224
+ value: 33.525
1225
+ - type: precision_at_100
1226
+ value: 10.67
1227
+ - type: precision_at_1000
1228
+ value: 2.221
1229
+ - type: precision_at_3
1230
+ value: 49.417
1231
+ - type: precision_at_5
1232
+ value: 42.15
1233
+ - type: recall_at_1
1234
+ value: 9.344
1235
+ - type: recall_at_10
1236
+ value: 25.209
1237
+ - type: recall_at_100
1238
+ value: 52.329
1239
+ - type: recall_at_1000
1240
+ value: 74.2
1241
+ - type: recall_at_3
1242
+ value: 15.699
1243
+ - type: recall_at_5
1244
+ value: 19.24
1245
+ - task:
1246
+ type: Classification
1247
+ dataset:
1248
+ type: mteb/emotion
1249
+ name: MTEB EmotionClassification
1250
+ config: default
1251
+ split: test
1252
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
1253
+ metrics:
1254
+ - type: accuracy
1255
+ value: 48.05
1256
+ - type: f1
1257
+ value: 43.06718139212933
1258
+ - task:
1259
+ type: Retrieval
1260
+ dataset:
1261
+ type: fever
1262
+ name: MTEB FEVER
1263
+ config: default
1264
+ split: test
1265
+ revision: None
1266
+ metrics:
1267
+ - type: map_at_1
1268
+ value: 46.452
1269
+ - type: map_at_10
1270
+ value: 58.825
1271
+ - type: map_at_100
1272
+ value: 59.372
1273
+ - type: map_at_1000
1274
+ value: 59.399
1275
+ - type: map_at_3
1276
+ value: 56.264
1277
+ - type: map_at_5
1278
+ value: 57.879999999999995
1279
+ - type: mrr_at_1
1280
+ value: 49.82
1281
+ - type: mrr_at_10
1282
+ value: 62.178999999999995
1283
+ - type: mrr_at_100
1284
+ value: 62.641999999999996
1285
+ - type: mrr_at_1000
1286
+ value: 62.658
1287
+ - type: mrr_at_3
1288
+ value: 59.706
1289
+ - type: mrr_at_5
1290
+ value: 61.283
1291
+ - type: ndcg_at_1
1292
+ value: 49.82
1293
+ - type: ndcg_at_10
1294
+ value: 65.031
1295
+ - type: ndcg_at_100
1296
+ value: 67.413
1297
+ - type: ndcg_at_1000
1298
+ value: 68.014
1299
+ - type: ndcg_at_3
1300
+ value: 60.084
1301
+ - type: ndcg_at_5
1302
+ value: 62.858000000000004
1303
+ - type: precision_at_1
1304
+ value: 49.82
1305
+ - type: precision_at_10
1306
+ value: 8.876000000000001
1307
+ - type: precision_at_100
1308
+ value: 1.018
1309
+ - type: precision_at_1000
1310
+ value: 0.109
1311
+ - type: precision_at_3
1312
+ value: 24.477
1313
+ - type: precision_at_5
1314
+ value: 16.208
1315
+ - type: recall_at_1
1316
+ value: 46.452
1317
+ - type: recall_at_10
1318
+ value: 80.808
1319
+ - type: recall_at_100
1320
+ value: 91.215
1321
+ - type: recall_at_1000
1322
+ value: 95.52000000000001
1323
+ - type: recall_at_3
1324
+ value: 67.62899999999999
1325
+ - type: recall_at_5
1326
+ value: 74.32900000000001
1327
+ - task:
1328
+ type: Retrieval
1329
+ dataset:
1330
+ type: fiqa
1331
+ name: MTEB FiQA2018
1332
+ config: default
1333
+ split: test
1334
+ revision: None
1335
+ metrics:
1336
+ - type: map_at_1
1337
+ value: 18.351
1338
+ - type: map_at_10
1339
+ value: 30.796
1340
+ - type: map_at_100
1341
+ value: 32.621
1342
+ - type: map_at_1000
1343
+ value: 32.799
1344
+ - type: map_at_3
1345
+ value: 26.491
1346
+ - type: map_at_5
1347
+ value: 28.933999999999997
1348
+ - type: mrr_at_1
1349
+ value: 36.265
1350
+ - type: mrr_at_10
1351
+ value: 45.556999999999995
1352
+ - type: mrr_at_100
1353
+ value: 46.323
1354
+ - type: mrr_at_1000
1355
+ value: 46.359
1356
+ - type: mrr_at_3
1357
+ value: 42.695
1358
+ - type: mrr_at_5
1359
+ value: 44.324000000000005
1360
+ - type: ndcg_at_1
1361
+ value: 36.265
1362
+ - type: ndcg_at_10
1363
+ value: 38.558
1364
+ - type: ndcg_at_100
1365
+ value: 45.18
1366
+ - type: ndcg_at_1000
1367
+ value: 48.292
1368
+ - type: ndcg_at_3
1369
+ value: 34.204
1370
+ - type: ndcg_at_5
1371
+ value: 35.735
1372
+ - type: precision_at_1
1373
+ value: 36.265
1374
+ - type: precision_at_10
1375
+ value: 10.879999999999999
1376
+ - type: precision_at_100
1377
+ value: 1.77
1378
+ - type: precision_at_1000
1379
+ value: 0.234
1380
+ - type: precision_at_3
1381
+ value: 23.044999999999998
1382
+ - type: precision_at_5
1383
+ value: 17.253
1384
+ - type: recall_at_1
1385
+ value: 18.351
1386
+ - type: recall_at_10
1387
+ value: 46.116
1388
+ - type: recall_at_100
1389
+ value: 70.786
1390
+ - type: recall_at_1000
1391
+ value: 89.46300000000001
1392
+ - type: recall_at_3
1393
+ value: 31.404
1394
+ - type: recall_at_5
1395
+ value: 37.678
1396
+ - task:
1397
+ type: Retrieval
1398
+ dataset:
1399
+ type: hotpotqa
1400
+ name: MTEB HotpotQA
1401
+ config: default
1402
+ split: test
1403
+ revision: None
1404
+ metrics:
1405
+ - type: map_at_1
1406
+ value: 36.847
1407
+ - type: map_at_10
1408
+ value: 54.269999999999996
1409
+ - type: map_at_100
1410
+ value: 55.152
1411
+ - type: map_at_1000
1412
+ value: 55.223
1413
+ - type: map_at_3
1414
+ value: 51.166
1415
+ - type: map_at_5
1416
+ value: 53.055
1417
+ - type: mrr_at_1
1418
+ value: 73.693
1419
+ - type: mrr_at_10
1420
+ value: 79.975
1421
+ - type: mrr_at_100
1422
+ value: 80.202
1423
+ - type: mrr_at_1000
1424
+ value: 80.214
1425
+ - type: mrr_at_3
1426
+ value: 78.938
1427
+ - type: mrr_at_5
1428
+ value: 79.595
1429
+ - type: ndcg_at_1
1430
+ value: 73.693
1431
+ - type: ndcg_at_10
1432
+ value: 63.334999999999994
1433
+ - type: ndcg_at_100
1434
+ value: 66.452
1435
+ - type: ndcg_at_1000
1436
+ value: 67.869
1437
+ - type: ndcg_at_3
1438
+ value: 58.829
1439
+ - type: ndcg_at_5
1440
+ value: 61.266
1441
+ - type: precision_at_1
1442
+ value: 73.693
1443
+ - type: precision_at_10
1444
+ value: 13.122
1445
+ - type: precision_at_100
1446
+ value: 1.5559999999999998
1447
+ - type: precision_at_1000
1448
+ value: 0.174
1449
+ - type: precision_at_3
1450
+ value: 37.083
1451
+ - type: precision_at_5
1452
+ value: 24.169999999999998
1453
+ - type: recall_at_1
1454
+ value: 36.847
1455
+ - type: recall_at_10
1456
+ value: 65.61099999999999
1457
+ - type: recall_at_100
1458
+ value: 77.792
1459
+ - type: recall_at_1000
1460
+ value: 87.17099999999999
1461
+ - type: recall_at_3
1462
+ value: 55.625
1463
+ - type: recall_at_5
1464
+ value: 60.425
1465
+ - task:
1466
+ type: Classification
1467
+ dataset:
1468
+ type: mteb/imdb
1469
+ name: MTEB ImdbClassification
1470
+ config: default
1471
+ split: test
1472
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1473
+ metrics:
1474
+ - type: accuracy
1475
+ value: 82.1096
1476
+ - type: ap
1477
+ value: 76.67089212843918
1478
+ - type: f1
1479
+ value: 82.03535056754939
1480
+ - task:
1481
+ type: Retrieval
1482
+ dataset:
1483
+ type: msmarco
1484
+ name: MTEB MSMARCO
1485
+ config: default
1486
+ split: dev
1487
+ revision: None
1488
+ metrics:
1489
+ - type: map_at_1
1490
+ value: 24.465
1491
+ - type: map_at_10
1492
+ value: 37.072
1493
+ - type: map_at_100
1494
+ value: 38.188
1495
+ - type: map_at_1000
1496
+ value: 38.232
1497
+ - type: map_at_3
1498
+ value: 33.134
1499
+ - type: map_at_5
1500
+ value: 35.453
1501
+ - type: mrr_at_1
1502
+ value: 25.142999999999997
1503
+ - type: mrr_at_10
1504
+ value: 37.669999999999995
1505
+ - type: mrr_at_100
1506
+ value: 38.725
1507
+ - type: mrr_at_1000
1508
+ value: 38.765
1509
+ - type: mrr_at_3
1510
+ value: 33.82
1511
+ - type: mrr_at_5
1512
+ value: 36.111
1513
+ - type: ndcg_at_1
1514
+ value: 25.142999999999997
1515
+ - type: ndcg_at_10
1516
+ value: 44.054
1517
+ - type: ndcg_at_100
1518
+ value: 49.364000000000004
1519
+ - type: ndcg_at_1000
1520
+ value: 50.456
1521
+ - type: ndcg_at_3
1522
+ value: 36.095
1523
+ - type: ndcg_at_5
1524
+ value: 40.23
1525
+ - type: precision_at_1
1526
+ value: 25.142999999999997
1527
+ - type: precision_at_10
1528
+ value: 6.845
1529
+ - type: precision_at_100
1530
+ value: 0.95
1531
+ - type: precision_at_1000
1532
+ value: 0.104
1533
+ - type: precision_at_3
1534
+ value: 15.204999999999998
1535
+ - type: precision_at_5
1536
+ value: 11.221
1537
+ - type: recall_at_1
1538
+ value: 24.465
1539
+ - type: recall_at_10
1540
+ value: 65.495
1541
+ - type: recall_at_100
1542
+ value: 89.888
1543
+ - type: recall_at_1000
1544
+ value: 98.165
1545
+ - type: recall_at_3
1546
+ value: 43.964
1547
+ - type: recall_at_5
1548
+ value: 53.891
1549
+ - task:
1550
+ type: Classification
1551
+ dataset:
1552
+ type: mteb/mtop_domain
1553
+ name: MTEB MTOPDomainClassification (en)
1554
+ config: en
1555
+ split: test
1556
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1557
+ metrics:
1558
+ - type: accuracy
1559
+ value: 93.86228910168718
1560
+ - type: f1
1561
+ value: 93.69177113259104
1562
+ - task:
1563
+ type: Classification
1564
+ dataset:
1565
+ type: mteb/mtop_intent
1566
+ name: MTEB MTOPIntentClassification (en)
1567
+ config: en
1568
+ split: test
1569
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1570
+ metrics:
1571
+ - type: accuracy
1572
+ value: 76.3999088007296
1573
+ - type: f1
1574
+ value: 58.96668664333438
1575
+ - task:
1576
+ type: Classification
1577
+ dataset:
1578
+ type: mteb/amazon_massive_intent
1579
+ name: MTEB MassiveIntentClassification (en)
1580
+ config: en
1581
+ split: test
1582
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1583
+ metrics:
1584
+ - type: accuracy
1585
+ value: 73.21788836583727
1586
+ - type: f1
1587
+ value: 71.4545936552952
1588
+ - task:
1589
+ type: Classification
1590
+ dataset:
1591
+ type: mteb/amazon_massive_scenario
1592
+ name: MTEB MassiveScenarioClassification (en)
1593
+ config: en
1594
+ split: test
1595
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1596
+ metrics:
1597
+ - type: accuracy
1598
+ value: 77.39071956960323
1599
+ - type: f1
1600
+ value: 77.12398952847603
1601
+ - task:
1602
+ type: Clustering
1603
+ dataset:
1604
+ type: mteb/medrxiv-clustering-p2p
1605
+ name: MTEB MedrxivClusteringP2P
1606
+ config: default
1607
+ split: test
1608
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1609
+ metrics:
1610
+ - type: v_measure
1611
+ value: 32.255379528166955
1612
+ - task:
1613
+ type: Clustering
1614
+ dataset:
1615
+ type: mteb/medrxiv-clustering-s2s
1616
+ name: MTEB MedrxivClusteringS2S
1617
+ config: default
1618
+ split: test
1619
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1620
+ metrics:
1621
+ - type: v_measure
1622
+ value: 29.66423362872814
1623
+ - task:
1624
+ type: Reranking
1625
+ dataset:
1626
+ type: mteb/mind_small
1627
+ name: MTEB MindSmallReranking
1628
+ config: default
1629
+ split: test
1630
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1631
+ metrics:
1632
+ - type: map
1633
+ value: 30.782211620375964
1634
+ - type: mrr
1635
+ value: 31.773479703044956
1636
+ - task:
1637
+ type: Retrieval
1638
+ dataset:
1639
+ type: nfcorpus
1640
+ name: MTEB NFCorpus
1641
+ config: default
1642
+ split: test
1643
+ revision: None
1644
+ metrics:
1645
+ - type: map_at_1
1646
+ value: 5.863
1647
+ - type: map_at_10
1648
+ value: 13.831
1649
+ - type: map_at_100
1650
+ value: 17.534
1651
+ - type: map_at_1000
1652
+ value: 19.012
1653
+ - type: map_at_3
1654
+ value: 10.143
1655
+ - type: map_at_5
1656
+ value: 12.034
1657
+ - type: mrr_at_1
1658
+ value: 46.749
1659
+ - type: mrr_at_10
1660
+ value: 55.376999999999995
1661
+ - type: mrr_at_100
1662
+ value: 56.009
1663
+ - type: mrr_at_1000
1664
+ value: 56.042
1665
+ - type: mrr_at_3
1666
+ value: 53.30200000000001
1667
+ - type: mrr_at_5
1668
+ value: 54.85
1669
+ - type: ndcg_at_1
1670
+ value: 44.582
1671
+ - type: ndcg_at_10
1672
+ value: 36.07
1673
+ - type: ndcg_at_100
1674
+ value: 33.39
1675
+ - type: ndcg_at_1000
1676
+ value: 41.884
1677
+ - type: ndcg_at_3
1678
+ value: 41.441
1679
+ - type: ndcg_at_5
1680
+ value: 39.861000000000004
1681
+ - type: precision_at_1
1682
+ value: 46.129999999999995
1683
+ - type: precision_at_10
1684
+ value: 26.594
1685
+ - type: precision_at_100
1686
+ value: 8.365
1687
+ - type: precision_at_1000
1688
+ value: 2.1260000000000003
1689
+ - type: precision_at_3
1690
+ value: 39.009
1691
+ - type: precision_at_5
1692
+ value: 34.861
1693
+ - type: recall_at_1
1694
+ value: 5.863
1695
+ - type: recall_at_10
1696
+ value: 17.961
1697
+ - type: recall_at_100
1698
+ value: 34.026
1699
+ - type: recall_at_1000
1700
+ value: 64.46499999999999
1701
+ - type: recall_at_3
1702
+ value: 11.242
1703
+ - type: recall_at_5
1704
+ value: 14.493
1705
+ - task:
1706
+ type: Retrieval
1707
+ dataset:
1708
+ type: nq
1709
+ name: MTEB NQ
1710
+ config: default
1711
+ split: test
1712
+ revision: None
1713
+ metrics:
1714
+ - type: map_at_1
1715
+ value: 38.601
1716
+ - type: map_at_10
1717
+ value: 55.293000000000006
1718
+ - type: map_at_100
1719
+ value: 56.092
1720
+ - type: map_at_1000
1721
+ value: 56.111999999999995
1722
+ - type: map_at_3
1723
+ value: 51.269
1724
+ - type: map_at_5
1725
+ value: 53.787
1726
+ - type: mrr_at_1
1727
+ value: 43.221
1728
+ - type: mrr_at_10
1729
+ value: 57.882999999999996
1730
+ - type: mrr_at_100
1731
+ value: 58.408
1732
+ - type: mrr_at_1000
1733
+ value: 58.421
1734
+ - type: mrr_at_3
1735
+ value: 54.765
1736
+ - type: mrr_at_5
1737
+ value: 56.809
1738
+ - type: ndcg_at_1
1739
+ value: 43.221
1740
+ - type: ndcg_at_10
1741
+ value: 62.858999999999995
1742
+ - type: ndcg_at_100
1743
+ value: 65.987
1744
+ - type: ndcg_at_1000
1745
+ value: 66.404
1746
+ - type: ndcg_at_3
1747
+ value: 55.605000000000004
1748
+ - type: ndcg_at_5
1749
+ value: 59.723000000000006
1750
+ - type: precision_at_1
1751
+ value: 43.221
1752
+ - type: precision_at_10
1753
+ value: 9.907
1754
+ - type: precision_at_100
1755
+ value: 1.169
1756
+ - type: precision_at_1000
1757
+ value: 0.121
1758
+ - type: precision_at_3
1759
+ value: 25.019000000000002
1760
+ - type: precision_at_5
1761
+ value: 17.474
1762
+ - type: recall_at_1
1763
+ value: 38.601
1764
+ - type: recall_at_10
1765
+ value: 82.966
1766
+ - type: recall_at_100
1767
+ value: 96.154
1768
+ - type: recall_at_1000
1769
+ value: 99.223
1770
+ - type: recall_at_3
1771
+ value: 64.603
1772
+ - type: recall_at_5
1773
+ value: 73.97200000000001
1774
+ - task:
1775
+ type: Retrieval
1776
+ dataset:
1777
+ type: quora
1778
+ name: MTEB QuoraRetrieval
1779
+ config: default
1780
+ split: test
1781
+ revision: None
1782
+ metrics:
1783
+ - type: map_at_1
1784
+ value: 70.77
1785
+ - type: map_at_10
1786
+ value: 84.429
1787
+ - type: map_at_100
1788
+ value: 85.04599999999999
1789
+ - type: map_at_1000
1790
+ value: 85.065
1791
+ - type: map_at_3
1792
+ value: 81.461
1793
+ - type: map_at_5
1794
+ value: 83.316
1795
+ - type: mrr_at_1
1796
+ value: 81.51
1797
+ - type: mrr_at_10
1798
+ value: 87.52799999999999
1799
+ - type: mrr_at_100
1800
+ value: 87.631
1801
+ - type: mrr_at_1000
1802
+ value: 87.632
1803
+ - type: mrr_at_3
1804
+ value: 86.533
1805
+ - type: mrr_at_5
1806
+ value: 87.214
1807
+ - type: ndcg_at_1
1808
+ value: 81.47999999999999
1809
+ - type: ndcg_at_10
1810
+ value: 88.181
1811
+ - type: ndcg_at_100
1812
+ value: 89.39200000000001
1813
+ - type: ndcg_at_1000
1814
+ value: 89.52
1815
+ - type: ndcg_at_3
1816
+ value: 85.29299999999999
1817
+ - type: ndcg_at_5
1818
+ value: 86.88
1819
+ - type: precision_at_1
1820
+ value: 81.47999999999999
1821
+ - type: precision_at_10
1822
+ value: 13.367
1823
+ - type: precision_at_100
1824
+ value: 1.5230000000000001
1825
+ - type: precision_at_1000
1826
+ value: 0.157
1827
+ - type: precision_at_3
1828
+ value: 37.227
1829
+ - type: precision_at_5
1830
+ value: 24.494
1831
+ - type: recall_at_1
1832
+ value: 70.77
1833
+ - type: recall_at_10
1834
+ value: 95.199
1835
+ - type: recall_at_100
1836
+ value: 99.37700000000001
1837
+ - type: recall_at_1000
1838
+ value: 99.973
1839
+ - type: recall_at_3
1840
+ value: 86.895
1841
+ - type: recall_at_5
1842
+ value: 91.396
1843
+ - task:
1844
+ type: Clustering
1845
+ dataset:
1846
+ type: mteb/reddit-clustering
1847
+ name: MTEB RedditClustering
1848
+ config: default
1849
+ split: test
1850
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
1851
+ metrics:
1852
+ - type: v_measure
1853
+ value: 50.686353396858344
1854
+ - task:
1855
+ type: Clustering
1856
+ dataset:
1857
+ type: mteb/reddit-clustering-p2p
1858
+ name: MTEB RedditClusteringP2P
1859
+ config: default
1860
+ split: test
1861
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
1862
+ metrics:
1863
+ - type: v_measure
1864
+ value: 61.3664675312921
1865
+ - task:
1866
+ type: Retrieval
1867
+ dataset:
1868
+ type: scidocs
1869
+ name: MTEB SCIDOCS
1870
+ config: default
1871
+ split: test
1872
+ revision: None
1873
+ metrics:
1874
+ - type: map_at_1
1875
+ value: 4.7379999999999995
1876
+ - type: map_at_10
1877
+ value: 12.01
1878
+ - type: map_at_100
1879
+ value: 14.02
1880
+ - type: map_at_1000
1881
+ value: 14.310999999999998
1882
+ - type: map_at_3
1883
+ value: 8.459
1884
+ - type: map_at_5
1885
+ value: 10.281
1886
+ - type: mrr_at_1
1887
+ value: 23.3
1888
+ - type: mrr_at_10
1889
+ value: 34.108
1890
+ - type: mrr_at_100
1891
+ value: 35.217
1892
+ - type: mrr_at_1000
1893
+ value: 35.272
1894
+ - type: mrr_at_3
1895
+ value: 30.833
1896
+ - type: mrr_at_5
1897
+ value: 32.768
1898
+ - type: ndcg_at_1
1899
+ value: 23.3
1900
+ - type: ndcg_at_10
1901
+ value: 20.116999999999997
1902
+ - type: ndcg_at_100
1903
+ value: 27.961000000000002
1904
+ - type: ndcg_at_1000
1905
+ value: 33.149
1906
+ - type: ndcg_at_3
1907
+ value: 18.902
1908
+ - type: ndcg_at_5
1909
+ value: 16.742
1910
+ - type: precision_at_1
1911
+ value: 23.3
1912
+ - type: precision_at_10
1913
+ value: 10.47
1914
+ - type: precision_at_100
1915
+ value: 2.177
1916
+ - type: precision_at_1000
1917
+ value: 0.34299999999999997
1918
+ - type: precision_at_3
1919
+ value: 17.567
1920
+ - type: precision_at_5
1921
+ value: 14.78
1922
+ - type: recall_at_1
1923
+ value: 4.7379999999999995
1924
+ - type: recall_at_10
1925
+ value: 21.221999999999998
1926
+ - type: recall_at_100
1927
+ value: 44.242
1928
+ - type: recall_at_1000
1929
+ value: 69.652
1930
+ - type: recall_at_3
1931
+ value: 10.688
1932
+ - type: recall_at_5
1933
+ value: 14.982999999999999
1934
+ - task:
1935
+ type: STS
1936
+ dataset:
1937
+ type: mteb/sickr-sts
1938
+ name: MTEB SICK-R
1939
+ config: default
1940
+ split: test
1941
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1942
+ metrics:
1943
+ - type: cos_sim_pearson
1944
+ value: 84.84572946827069
1945
+ - type: cos_sim_spearman
1946
+ value: 80.48508130408966
1947
+ - type: euclidean_pearson
1948
+ value: 82.0481530027767
1949
+ - type: euclidean_spearman
1950
+ value: 80.45902876782752
1951
+ - type: manhattan_pearson
1952
+ value: 82.03728222483326
1953
+ - type: manhattan_spearman
1954
+ value: 80.45684282911755
1955
+ - task:
1956
+ type: STS
1957
+ dataset:
1958
+ type: mteb/sts12-sts
1959
+ name: MTEB STS12
1960
+ config: default
1961
+ split: test
1962
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1963
+ metrics:
1964
+ - type: cos_sim_pearson
1965
+ value: 84.33476464677516
1966
+ - type: cos_sim_spearman
1967
+ value: 75.93057758003266
1968
+ - type: euclidean_pearson
1969
+ value: 80.89685744015691
1970
+ - type: euclidean_spearman
1971
+ value: 76.29929953441706
1972
+ - type: manhattan_pearson
1973
+ value: 80.91391345459995
1974
+ - type: manhattan_spearman
1975
+ value: 76.31985463110914
1976
+ - task:
1977
+ type: STS
1978
+ dataset:
1979
+ type: mteb/sts13-sts
1980
+ name: MTEB STS13
1981
+ config: default
1982
+ split: test
1983
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1984
+ metrics:
1985
+ - type: cos_sim_pearson
1986
+ value: 84.63686106359005
1987
+ - type: cos_sim_spearman
1988
+ value: 85.22240034668202
1989
+ - type: euclidean_pearson
1990
+ value: 84.6074814189106
1991
+ - type: euclidean_spearman
1992
+ value: 85.17169644755828
1993
+ - type: manhattan_pearson
1994
+ value: 84.48329306239368
1995
+ - type: manhattan_spearman
1996
+ value: 85.0086508544768
1997
+ - task:
1998
+ type: STS
1999
+ dataset:
2000
+ type: mteb/sts14-sts
2001
+ name: MTEB STS14
2002
+ config: default
2003
+ split: test
2004
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
2005
+ metrics:
2006
+ - type: cos_sim_pearson
2007
+ value: 82.95455774064745
2008
+ - type: cos_sim_spearman
2009
+ value: 80.54074646118492
2010
+ - type: euclidean_pearson
2011
+ value: 81.79598955554704
2012
+ - type: euclidean_spearman
2013
+ value: 80.55837617606814
2014
+ - type: manhattan_pearson
2015
+ value: 81.78213797905386
2016
+ - type: manhattan_spearman
2017
+ value: 80.5666746878273
2018
+ - task:
2019
+ type: STS
2020
+ dataset:
2021
+ type: mteb/sts15-sts
2022
+ name: MTEB STS15
2023
+ config: default
2024
+ split: test
2025
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
2026
+ metrics:
2027
+ - type: cos_sim_pearson
2028
+ value: 87.92813309124739
2029
+ - type: cos_sim_spearman
2030
+ value: 88.81459873052108
2031
+ - type: euclidean_pearson
2032
+ value: 88.21193118930564
2033
+ - type: euclidean_spearman
2034
+ value: 88.87072745043731
2035
+ - type: manhattan_pearson
2036
+ value: 88.22576929706727
2037
+ - type: manhattan_spearman
2038
+ value: 88.8867671095791
2039
+ - task:
2040
+ type: STS
2041
+ dataset:
2042
+ type: mteb/sts16-sts
2043
+ name: MTEB STS16
2044
+ config: default
2045
+ split: test
2046
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
2047
+ metrics:
2048
+ - type: cos_sim_pearson
2049
+ value: 83.6881529671839
2050
+ - type: cos_sim_spearman
2051
+ value: 85.2807092969554
2052
+ - type: euclidean_pearson
2053
+ value: 84.62334178652704
2054
+ - type: euclidean_spearman
2055
+ value: 85.2116373296784
2056
+ - type: manhattan_pearson
2057
+ value: 84.54948211541777
2058
+ - type: manhattan_spearman
2059
+ value: 85.10737722637882
2060
+ - task:
2061
+ type: STS
2062
+ dataset:
2063
+ type: mteb/sts17-crosslingual-sts
2064
+ name: MTEB STS17 (en-en)
2065
+ config: en-en
2066
+ split: test
2067
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
2068
+ metrics:
2069
+ - type: cos_sim_pearson
2070
+ value: 88.55963694458408
2071
+ - type: cos_sim_spearman
2072
+ value: 89.36731628848683
2073
+ - type: euclidean_pearson
2074
+ value: 89.64975952985465
2075
+ - type: euclidean_spearman
2076
+ value: 89.29689484033007
2077
+ - type: manhattan_pearson
2078
+ value: 89.61234491713135
2079
+ - type: manhattan_spearman
2080
+ value: 89.20302520255782
2081
+ - task:
2082
+ type: STS
2083
+ dataset:
2084
+ type: mteb/sts22-crosslingual-sts
2085
+ name: MTEB STS22 (en)
2086
+ config: en
2087
+ split: test
2088
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
2089
+ metrics:
2090
+ - type: cos_sim_pearson
2091
+ value: 62.411800961903886
2092
+ - type: cos_sim_spearman
2093
+ value: 62.99105515749963
2094
+ - type: euclidean_pearson
2095
+ value: 65.29826669549443
2096
+ - type: euclidean_spearman
2097
+ value: 63.29880964105775
2098
+ - type: manhattan_pearson
2099
+ value: 65.00126190601183
2100
+ - type: manhattan_spearman
2101
+ value: 63.32011025899179
2102
+ - task:
2103
+ type: STS
2104
+ dataset:
2105
+ type: mteb/stsbenchmark-sts
2106
+ name: MTEB STSBenchmark
2107
+ config: default
2108
+ split: test
2109
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
2110
+ metrics:
2111
+ - type: cos_sim_pearson
2112
+ value: 85.83498531837608
2113
+ - type: cos_sim_spearman
2114
+ value: 87.21366640615442
2115
+ - type: euclidean_pearson
2116
+ value: 86.74764288798261
2117
+ - type: euclidean_spearman
2118
+ value: 87.06060470780834
2119
+ - type: manhattan_pearson
2120
+ value: 86.65971223951476
2121
+ - type: manhattan_spearman
2122
+ value: 86.99814399831457
2123
+ - task:
2124
+ type: Reranking
2125
+ dataset:
2126
+ type: mteb/scidocs-reranking
2127
+ name: MTEB SciDocsRR
2128
+ config: default
2129
+ split: test
2130
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
2131
+ metrics:
2132
+ - type: map
2133
+ value: 83.94448463485881
2134
+ - type: mrr
2135
+ value: 95.36291867174221
2136
+ - task:
2137
+ type: Retrieval
2138
+ dataset:
2139
+ type: scifact
2140
+ name: MTEB SciFact
2141
+ config: default
2142
+ split: test
2143
+ revision: None
2144
+ metrics:
2145
+ - type: map_at_1
2146
+ value: 59.928000000000004
2147
+ - type: map_at_10
2148
+ value: 68.577
2149
+ - type: map_at_100
2150
+ value: 69.35900000000001
2151
+ - type: map_at_1000
2152
+ value: 69.37299999999999
2153
+ - type: map_at_3
2154
+ value: 66.217
2155
+ - type: map_at_5
2156
+ value: 67.581
2157
+ - type: mrr_at_1
2158
+ value: 63.0
2159
+ - type: mrr_at_10
2160
+ value: 69.994
2161
+ - type: mrr_at_100
2162
+ value: 70.553
2163
+ - type: mrr_at_1000
2164
+ value: 70.56700000000001
2165
+ - type: mrr_at_3
2166
+ value: 68.167
2167
+ - type: mrr_at_5
2168
+ value: 69.11699999999999
2169
+ - type: ndcg_at_1
2170
+ value: 63.0
2171
+ - type: ndcg_at_10
2172
+ value: 72.58
2173
+ - type: ndcg_at_100
2174
+ value: 75.529
2175
+ - type: ndcg_at_1000
2176
+ value: 76.009
2177
+ - type: ndcg_at_3
2178
+ value: 68.523
2179
+ - type: ndcg_at_5
2180
+ value: 70.301
2181
+ - type: precision_at_1
2182
+ value: 63.0
2183
+ - type: precision_at_10
2184
+ value: 9.333
2185
+ - type: precision_at_100
2186
+ value: 1.09
2187
+ - type: precision_at_1000
2188
+ value: 0.11299999999999999
2189
+ - type: precision_at_3
2190
+ value: 26.444000000000003
2191
+ - type: precision_at_5
2192
+ value: 17.067
2193
+ - type: recall_at_1
2194
+ value: 59.928000000000004
2195
+ - type: recall_at_10
2196
+ value: 83.544
2197
+ - type: recall_at_100
2198
+ value: 96.0
2199
+ - type: recall_at_1000
2200
+ value: 100.0
2201
+ - type: recall_at_3
2202
+ value: 72.072
2203
+ - type: recall_at_5
2204
+ value: 76.683
2205
+ - task:
2206
+ type: PairClassification
2207
+ dataset:
2208
+ type: mteb/sprintduplicatequestions-pairclassification
2209
+ name: MTEB SprintDuplicateQuestions
2210
+ config: default
2211
+ split: test
2212
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
2213
+ metrics:
2214
+ - type: cos_sim_accuracy
2215
+ value: 99.82178217821782
2216
+ - type: cos_sim_ap
2217
+ value: 95.41507679819003
2218
+ - type: cos_sim_f1
2219
+ value: 90.9456740442656
2220
+ - type: cos_sim_precision
2221
+ value: 91.49797570850203
2222
+ - type: cos_sim_recall
2223
+ value: 90.4
2224
+ - type: dot_accuracy
2225
+ value: 99.77227722772277
2226
+ - type: dot_ap
2227
+ value: 92.50123869445967
2228
+ - type: dot_f1
2229
+ value: 88.18414322250638
2230
+ - type: dot_precision
2231
+ value: 90.26178010471205
2232
+ - type: dot_recall
2233
+ value: 86.2
2234
+ - type: euclidean_accuracy
2235
+ value: 99.81782178217821
2236
+ - type: euclidean_ap
2237
+ value: 95.3935066749006
2238
+ - type: euclidean_f1
2239
+ value: 90.66128218071681
2240
+ - type: euclidean_precision
2241
+ value: 91.53924566768603
2242
+ - type: euclidean_recall
2243
+ value: 89.8
2244
+ - type: manhattan_accuracy
2245
+ value: 99.81881188118813
2246
+ - type: manhattan_ap
2247
+ value: 95.39767454613512
2248
+ - type: manhattan_f1
2249
+ value: 90.62019477191186
2250
+ - type: manhattan_precision
2251
+ value: 92.95478443743428
2252
+ - type: manhattan_recall
2253
+ value: 88.4
2254
+ - type: max_accuracy
2255
+ value: 99.82178217821782
2256
+ - type: max_ap
2257
+ value: 95.41507679819003
2258
+ - type: max_f1
2259
+ value: 90.9456740442656
2260
+ - task:
2261
+ type: Clustering
2262
+ dataset:
2263
+ type: mteb/stackexchange-clustering
2264
+ name: MTEB StackExchangeClustering
2265
+ config: default
2266
+ split: test
2267
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
2268
+ metrics:
2269
+ - type: v_measure
2270
+ value: 64.96313921233748
2271
+ - task:
2272
+ type: Clustering
2273
+ dataset:
2274
+ type: mteb/stackexchange-clustering-p2p
2275
+ name: MTEB StackExchangeClusteringP2P
2276
+ config: default
2277
+ split: test
2278
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
2279
+ metrics:
2280
+ - type: v_measure
2281
+ value: 33.602625720956745
2282
+ - task:
2283
+ type: Reranking
2284
+ dataset:
2285
+ type: mteb/stackoverflowdupquestions-reranking
2286
+ name: MTEB StackOverflowDupQuestions
2287
+ config: default
2288
+ split: test
2289
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
2290
+ metrics:
2291
+ - type: map
2292
+ value: 51.32659230651731
2293
+ - type: mrr
2294
+ value: 52.33861726508785
2295
+ - task:
2296
+ type: Summarization
2297
+ dataset:
2298
+ type: mteb/summeval
2299
+ name: MTEB SummEval
2300
+ config: default
2301
+ split: test
2302
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
2303
+ metrics:
2304
+ - type: cos_sim_pearson
2305
+ value: 25.658532855940212
2306
+ - type: cos_sim_spearman
2307
+ value: 25.202702076359323
2308
+ - type: dot_pearson
2309
+ value: 21.585479641185145
2310
+ - type: dot_spearman
2311
+ value: 23.03461045573253
2312
+ - task:
2313
+ type: Retrieval
2314
+ dataset:
2315
+ type: trec-covid
2316
+ name: MTEB TRECCOVID
2317
+ config: default
2318
+ split: test
2319
+ revision: None
2320
+ metrics:
2321
+ - type: map_at_1
2322
+ value: 0.22
2323
+ - type: map_at_10
2324
+ value: 1.9539999999999997
2325
+ - type: map_at_100
2326
+ value: 11.437
2327
+ - type: map_at_1000
2328
+ value: 27.861000000000004
2329
+ - type: map_at_3
2330
+ value: 0.6479999999999999
2331
+ - type: map_at_5
2332
+ value: 1.0410000000000001
2333
+ - type: mrr_at_1
2334
+ value: 84.0
2335
+ - type: mrr_at_10
2336
+ value: 90.333
2337
+ - type: mrr_at_100
2338
+ value: 90.333
2339
+ - type: mrr_at_1000
2340
+ value: 90.333
2341
+ - type: mrr_at_3
2342
+ value: 90.333
2343
+ - type: mrr_at_5
2344
+ value: 90.333
2345
+ - type: ndcg_at_1
2346
+ value: 80.0
2347
+ - type: ndcg_at_10
2348
+ value: 78.31700000000001
2349
+ - type: ndcg_at_100
2350
+ value: 59.396
2351
+ - type: ndcg_at_1000
2352
+ value: 52.733
2353
+ - type: ndcg_at_3
2354
+ value: 81.46900000000001
2355
+ - type: ndcg_at_5
2356
+ value: 80.74
2357
+ - type: precision_at_1
2358
+ value: 84.0
2359
+ - type: precision_at_10
2360
+ value: 84.0
2361
+ - type: precision_at_100
2362
+ value: 60.980000000000004
2363
+ - type: precision_at_1000
2364
+ value: 23.432
2365
+ - type: precision_at_3
2366
+ value: 87.333
2367
+ - type: precision_at_5
2368
+ value: 86.8
2369
+ - type: recall_at_1
2370
+ value: 0.22
2371
+ - type: recall_at_10
2372
+ value: 2.156
2373
+ - type: recall_at_100
2374
+ value: 14.557999999999998
2375
+ - type: recall_at_1000
2376
+ value: 49.553999999999995
2377
+ - type: recall_at_3
2378
+ value: 0.685
2379
+ - type: recall_at_5
2380
+ value: 1.121
2381
+ - task:
2382
+ type: Retrieval
2383
+ dataset:
2384
+ type: webis-touche2020
2385
+ name: MTEB Touche2020
2386
+ config: default
2387
+ split: test
2388
+ revision: None
2389
+ metrics:
2390
+ - type: map_at_1
2391
+ value: 3.373
2392
+ - type: map_at_10
2393
+ value: 11.701
2394
+ - type: map_at_100
2395
+ value: 17.144000000000002
2396
+ - type: map_at_1000
2397
+ value: 18.624
2398
+ - type: map_at_3
2399
+ value: 6.552
2400
+ - type: map_at_5
2401
+ value: 9.372
2402
+ - type: mrr_at_1
2403
+ value: 38.775999999999996
2404
+ - type: mrr_at_10
2405
+ value: 51.975
2406
+ - type: mrr_at_100
2407
+ value: 52.873999999999995
2408
+ - type: mrr_at_1000
2409
+ value: 52.873999999999995
2410
+ - type: mrr_at_3
2411
+ value: 47.619
2412
+ - type: mrr_at_5
2413
+ value: 50.578
2414
+ - type: ndcg_at_1
2415
+ value: 36.735
2416
+ - type: ndcg_at_10
2417
+ value: 27.212999999999997
2418
+ - type: ndcg_at_100
2419
+ value: 37.245
2420
+ - type: ndcg_at_1000
2421
+ value: 48.602000000000004
2422
+ - type: ndcg_at_3
2423
+ value: 30.916
2424
+ - type: ndcg_at_5
2425
+ value: 30.799
2426
+ - type: precision_at_1
2427
+ value: 38.775999999999996
2428
+ - type: precision_at_10
2429
+ value: 23.469
2430
+ - type: precision_at_100
2431
+ value: 7.327
2432
+ - type: precision_at_1000
2433
+ value: 1.486
2434
+ - type: precision_at_3
2435
+ value: 31.973000000000003
2436
+ - type: precision_at_5
2437
+ value: 32.245000000000005
2438
+ - type: recall_at_1
2439
+ value: 3.373
2440
+ - type: recall_at_10
2441
+ value: 17.404
2442
+ - type: recall_at_100
2443
+ value: 46.105000000000004
2444
+ - type: recall_at_1000
2445
+ value: 80.35
2446
+ - type: recall_at_3
2447
+ value: 7.4399999999999995
2448
+ - type: recall_at_5
2449
+ value: 12.183
2450
+ - task:
2451
+ type: Classification
2452
+ dataset:
2453
+ type: mteb/toxic_conversations_50k
2454
+ name: MTEB ToxicConversationsClassification
2455
+ config: default
2456
+ split: test
2457
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
2458
+ metrics:
2459
+ - type: accuracy
2460
+ value: 70.5592
2461
+ - type: ap
2462
+ value: 14.330910591410134
2463
+ - type: f1
2464
+ value: 54.45745186286521
2465
+ - task:
2466
+ type: Classification
2467
+ dataset:
2468
+ type: mteb/tweet_sentiment_extraction
2469
+ name: MTEB TweetSentimentExtractionClassification
2470
+ config: default
2471
+ split: test
2472
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
2473
+ metrics:
2474
+ - type: accuracy
2475
+ value: 61.20543293718167
2476
+ - type: f1
2477
+ value: 61.45365480309872
2478
+ - task:
2479
+ type: Clustering
2480
+ dataset:
2481
+ type: mteb/twentynewsgroups-clustering
2482
+ name: MTEB TwentyNewsgroupsClustering
2483
+ config: default
2484
+ split: test
2485
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
2486
+ metrics:
2487
+ - type: v_measure
2488
+ value: 43.81162998944145
2489
+ - task:
2490
+ type: PairClassification
2491
+ dataset:
2492
+ type: mteb/twittersemeval2015-pairclassification
2493
+ name: MTEB TwitterSemEval2015
2494
+ config: default
2495
+ split: test
2496
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
2497
+ metrics:
2498
+ - type: cos_sim_accuracy
2499
+ value: 86.69011146212075
2500
+ - type: cos_sim_ap
2501
+ value: 76.09792353652536
2502
+ - type: cos_sim_f1
2503
+ value: 70.10202763786646
2504
+ - type: cos_sim_precision
2505
+ value: 68.65671641791045
2506
+ - type: cos_sim_recall
2507
+ value: 71.60949868073878
2508
+ - type: dot_accuracy
2509
+ value: 85.33110806461227
2510
+ - type: dot_ap
2511
+ value: 70.19304383327554
2512
+ - type: dot_f1
2513
+ value: 67.22494202525122
2514
+ - type: dot_precision
2515
+ value: 65.6847935548842
2516
+ - type: dot_recall
2517
+ value: 68.83905013192611
2518
+ - type: euclidean_accuracy
2519
+ value: 86.5410979316922
2520
+ - type: euclidean_ap
2521
+ value: 75.91906915651882
2522
+ - type: euclidean_f1
2523
+ value: 69.6798975672215
2524
+ - type: euclidean_precision
2525
+ value: 67.6865671641791
2526
+ - type: euclidean_recall
2527
+ value: 71.79419525065963
2528
+ - type: manhattan_accuracy
2529
+ value: 86.60070334386363
2530
+ - type: manhattan_ap
2531
+ value: 75.94617413885031
2532
+ - type: manhattan_f1
2533
+ value: 69.52689565780946
2534
+ - type: manhattan_precision
2535
+ value: 68.3312101910828
2536
+ - type: manhattan_recall
2537
+ value: 70.76517150395777
2538
+ - type: max_accuracy
2539
+ value: 86.69011146212075
2540
+ - type: max_ap
2541
+ value: 76.09792353652536
2542
+ - type: max_f1
2543
+ value: 70.10202763786646
2544
+ - task:
2545
+ type: PairClassification
2546
+ dataset:
2547
+ type: mteb/twitterurlcorpus-pairclassification
2548
+ name: MTEB TwitterURLCorpus
2549
+ config: default
2550
+ split: test
2551
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
2552
+ metrics:
2553
+ - type: cos_sim_accuracy
2554
+ value: 89.25951798812434
2555
+ - type: cos_sim_ap
2556
+ value: 86.31476416599727
2557
+ - type: cos_sim_f1
2558
+ value: 78.52709971038477
2559
+ - type: cos_sim_precision
2560
+ value: 76.7629972792117
2561
+ - type: cos_sim_recall
2562
+ value: 80.37419156144134
2563
+ - type: dot_accuracy
2564
+ value: 88.03896456708192
2565
+ - type: dot_ap
2566
+ value: 83.26963599196237
2567
+ - type: dot_f1
2568
+ value: 76.72696459492317
2569
+ - type: dot_precision
2570
+ value: 73.56411162133521
2571
+ - type: dot_recall
2572
+ value: 80.17400677548507
2573
+ - type: euclidean_accuracy
2574
+ value: 89.21682772538519
2575
+ - type: euclidean_ap
2576
+ value: 86.29306071289969
2577
+ - type: euclidean_f1
2578
+ value: 78.40827030519554
2579
+ - type: euclidean_precision
2580
+ value: 77.42250243939053
2581
+ - type: euclidean_recall
2582
+ value: 79.41946412072683
2583
+ - type: manhattan_accuracy
2584
+ value: 89.22458959133776
2585
+ - type: manhattan_ap
2586
+ value: 86.2901934710645
2587
+ - type: manhattan_f1
2588
+ value: 78.54211378440453
2589
+ - type: manhattan_precision
2590
+ value: 76.85505858079729
2591
+ - type: manhattan_recall
2592
+ value: 80.30489682784109
2593
+ - type: max_accuracy
2594
+ value: 89.25951798812434
2595
+ - type: max_ap
2596
+ value: 86.31476416599727
2597
+ - type: max_f1
2598
+ value: 78.54211378440453
2599
+ ---
2600
+
2601
+ ## E5-large
2602
+
2603
+ [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
2604
+ Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
2605
+
2606
+ This model has 24 layers and the embedding size is 1024.
2607
+
2608
+ ## Usage
2609
+
2610
+ Below is an example of how to encode queries and passages from the MS-MARCO passage ranking dataset.
2611
+
2612
+ ```python
2613
+ import torch.nn.functional as F
2614
+
2615
+ from torch import Tensor
2616
+ from transformers import AutoTokenizer, AutoModel
2617
+ from transformers.modeling_outputs import BaseModelOutput
2618
+
2619
+
2620
+ def average_pool(last_hidden_states: Tensor,
2621
+                  attention_mask: Tensor) -> Tensor:
2622
+     last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
2623
+     return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
2624
+
2625
+
2626
+ # Each input text should start with "query: " or "passage: ".
2627
+ # For tasks other than retrieval, you can simply use the "query: " prefix.
2628
+ input_texts = ['query: how much protein should a female eat',
2629
+                'query: summit define',
2630
+ "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
2631
+ "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
2632
+
2633
+ tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-large')
2634
+ model = AutoModel.from_pretrained('intfloat/e5-large')
2635
+
2636
+ # Tokenize the input texts
2637
+ batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
2638
+
2639
+ outputs: BaseModelOutput = model(**batch_dict)
2640
+ embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
2641
+
2642
+ # (Optionally) normalize embeddings
2643
+ embeddings = F.normalize(embeddings, p=2, dim=1)
2644
+ scores = (embeddings[:2] @ embeddings[2:].T) * 100
2645
+ print(scores.tolist())
2646
+ ```
2647
+
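+ For a higher-level interface, the snippet below is a minimal sketch using the [sentence-transformers](https://www.sbert.net) library. It assumes a recent release that supports `normalize_embeddings` and relies on the library's default mean pooling when loading a plain Transformers checkpoint, which matches the `average_pool` function above; the `query: `/`passage: ` prefixes are still required.
+
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Illustrative sketch: sentence-transformers falls back to mean pooling
+ # for a plain Transformers checkpoint, matching average_pool above (assumption).
+ model = SentenceTransformer('intfloat/e5-large')
+
+ input_texts = ['query: how much protein should a female eat',
+                'passage: The summit is the highest point of a mountain.']
+
+ # normalize_embeddings=True applies the same L2 normalization as F.normalize above.
+ embeddings = model.encode(input_texts, normalize_embeddings=True)
+ print(embeddings.shape)
+ ```
+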
2648
+ ## Training Details
2649
+
2650
+ Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
2651
+
2652
+ ## Benchmark Evaluation
2653
+
2654
+ Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
2655
+ on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB](https://arxiv.org/abs/2210.07316) benchmarks.
2656
+
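+ For a quick single-task sanity check, the sketch below uses the [mteb](https://github.com/embeddings-benchmark/mteb) Python package. The task choice and the sentence-transformers wrapper are illustrative assumptions, and this simple wrapper does not add the "query: "/"passage: " prefixes, so its scores will not exactly match the table above; the official numbers were produced with the unilm/e5 scripts.
+
+ ```python
+ from mteb import MTEB
+ from sentence_transformers import SentenceTransformer
+
+ # Illustrative sketch: evaluate the checkpoint on one MTEB STS task.
+ model = SentenceTransformer('intfloat/e5-large')
+ evaluation = MTEB(tasks=["STSBenchmark"])
+ evaluation.run(model, output_folder="results/e5-large")
+ ```
+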
config.json ADDED
@@ -0,0 +1,25 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "_name_or_path": "tmp/",
3
+ "architectures": [
4
+ "BertModel"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.1,
7
+ "classifier_dropout": null,
8
+ "hidden_act": "gelu",
9
+ "hidden_dropout_prob": 0.1,
10
+ "hidden_size": 1024,
11
+ "initializer_range": 0.02,
12
+ "intermediate_size": 4096,
13
+ "layer_norm_eps": 1e-12,
14
+ "max_position_embeddings": 512,
15
+ "model_type": "bert",
16
+ "num_attention_heads": 16,
17
+ "num_hidden_layers": 24,
18
+ "pad_token_id": 0,
19
+ "position_embedding_type": "absolute",
20
+ "torch_dtype": "float32",
21
+ "transformers_version": "4.15.0",
22
+ "type_vocab_size": 2,
23
+ "use_cache": true,
24
+ "vocab_size": 30522
25
+ }
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:c2895f5b85318ea318234e2f9a5b81b957fdee46a24b196050d5c6df56c83200
3
+ size 1340718961
special_tokens_map.json ADDED
@@ -0,0 +1 @@
 
1
+ {"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
tokenizer_config.json ADDED
@@ -0,0 +1 @@
 
1
+ {"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "model_max_length": 512, "special_tokens_map_file": null, "name_or_path": "amlt/1101_large_qd_prompt_lr1e4_t001_ft_random_swap_nli/all_kd_ft/checkpoint-6000", "tokenizer_class": "BertTokenizer"}
vocab.txt ADDED
The diff for this file is too large to render. See raw diff