Sentence Similarity
sentence-transformers
PyTorch
Safetensors
Transformers
English
mpnet
fill-mask
feature-extraction
Inference Endpoints
5 papers
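For context, this diff belongs to the `sentence-transformers/all-mpnet-base-v2` model card, whose README demonstrates computing sentence embeddings (the `print(sentence_embeddings)` context in one of the hunks below comes from that snippet). A minimal sketch of that kind of usage, with illustrative input sentences — the README's exact snippet is not reproduced in this diff view:

```python
from sentence_transformers import SentenceTransformer

# Illustrative inputs; the README's own example sentences are not shown in this diff view.
sentences = [
    "This is an example sentence",
    "Each sentence is converted to a dense vector",
]

# Load the model this diff is for and encode the sentences into embeddings.
model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
sentence_embeddings = model.encode(sentences)

print(sentence_embeddings.shape)  # (2, 768) for all-mpnet-base-v2
print(sentence_embeddings)
```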
Files changed (1)
  1. README.md +2273 -2
README.md CHANGED
@@ -4,6 +4,7 @@ tags:
4
  - sentence-transformers
5
  - feature-extraction
6
  - sentence-similarity
 
7
  language: en
8
  license: apache-2.0
9
  datasets:
@@ -28,7 +29,2277 @@ datasets:
28
  - embedding-data/SPECTER
29
  - embedding-data/PAQ_pairs
30
  - embedding-data/WikiAnswers
31
-
32
  ---
33
 
34
 
@@ -93,7 +2364,7 @@ print(sentence_embeddings)
93
 
94
  ## Evaluation Results
95
 
96
- For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/all-mpnet-base-v2)
97
 
98
  ------
99
 
 
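The metadata introduced by this change (the `mteb` tag and the `model-index` block shown below) records MTEB benchmark scores for the model. A hedged sketch of how scores like these are typically produced, assuming the `MTEB` runner interface from the `mteb` package — the task name and output folder are illustrative, and the exact command used for this PR is not stated in the diff:

```python
from mteb import MTEB
from sentence_transformers import SentenceTransformer

# Evaluate the model on a single MTEB task; the model-index below covers many such tasks.
model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")
evaluation = MTEB(tasks=["Banking77Classification"])  # illustrative task from the results below
results = evaluation.run(model, output_folder="results/all-mpnet-base-v2")
print(results)
```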
4
  - sentence-transformers
5
  - feature-extraction
6
  - sentence-similarity
7
+ - mteb
8
  language: en
9
  license: apache-2.0
10
  datasets:
 
29
  - embedding-data/SPECTER
30
  - embedding-data/PAQ_pairs
31
  - embedding-data/WikiAnswers
32
+ model-index:
33
+ - name: all-mpnet-base-v2
34
+ results:
35
+ - task:
36
+ type: Classification
37
+ dataset:
38
+ type: mteb/amazon_counterfactual
39
+ name: MTEB AmazonCounterfactualClassification (en)
40
+ config: en
41
+ split: test
42
+ revision: 2d8a100785abf0ae21420d2a55b0c56e3e1ea996
43
+ metrics:
44
+ - type: accuracy
45
+ value: 65.26865671641791
46
+ - type: ap
47
+ value: 28.47453420428918
48
+ - type: f1
49
+ value: 59.3470101009448
50
+ - task:
51
+ type: Classification
52
+ dataset:
53
+ type: mteb/amazon_polarity
54
+ name: MTEB AmazonPolarityClassification
55
+ config: default
56
+ split: test
57
+ revision: 80714f8dcf8cefc218ef4f8c5a966dd83f75a0e1
58
+ metrics:
59
+ - type: accuracy
60
+ value: 67.13145
61
+ - type: ap
62
+ value: 61.842060778903786
63
+ - type: f1
64
+ value: 66.79987305640383
65
+ - task:
66
+ type: Classification
67
+ dataset:
68
+ type: mteb/amazon_reviews_multi
69
+ name: MTEB AmazonReviewsClassification (en)
70
+ config: en
71
+ split: test
72
+ revision: c379a6705fec24a2493fa68e011692605f44e119
73
+ metrics:
74
+ - type: accuracy
75
+ value: 31.920000000000005
76
+ - type: f1
77
+ value: 31.2465193896153
78
+ - task:
79
+ type: Retrieval
80
+ dataset:
81
+ type: arguana
82
+ name: MTEB ArguAna
83
+ config: default
84
+ split: test
85
+ revision: 5b3e3697907184a9b77a3c99ee9ea1a9cbb1e4e3
86
+ metrics:
87
+ - type: map_at_1
88
+ value: 23.186
89
+ - type: map_at_10
90
+ value: 37.692
91
+ - type: map_at_100
92
+ value: 38.986
93
+ - type: map_at_1000
94
+ value: 38.991
95
+ - type: map_at_3
96
+ value: 32.622
97
+ - type: map_at_5
98
+ value: 35.004999999999995
99
+ - type: ndcg_at_1
100
+ value: 23.186
101
+ - type: ndcg_at_10
102
+ value: 46.521
103
+ - type: ndcg_at_100
104
+ value: 51.954
105
+ - type: ndcg_at_1000
106
+ value: 52.087
107
+ - type: ndcg_at_3
108
+ value: 35.849
109
+ - type: ndcg_at_5
110
+ value: 40.12
111
+ - type: precision_at_1
112
+ value: 23.186
113
+ - type: precision_at_10
114
+ value: 7.510999999999999
115
+ - type: precision_at_100
116
+ value: 0.9860000000000001
117
+ - type: precision_at_1000
118
+ value: 0.1
119
+ - type: precision_at_3
120
+ value: 15.078
121
+ - type: precision_at_5
122
+ value: 11.110000000000001
123
+ - type: recall_at_1
124
+ value: 23.186
125
+ - type: recall_at_10
126
+ value: 75.107
127
+ - type: recall_at_100
128
+ value: 98.649
129
+ - type: recall_at_1000
130
+ value: 99.644
131
+ - type: recall_at_3
132
+ value: 45.235
133
+ - type: recall_at_5
134
+ value: 55.547999999999995
135
+ - task:
136
+ type: Clustering
137
+ dataset:
138
+ type: mteb/arxiv-clustering-p2p
139
+ name: MTEB ArxivClusteringP2P
140
+ config: default
141
+ split: test
142
+ revision: 0bbdb47bcbe3a90093699aefeed338a0f28a7ee8
143
+ metrics:
144
+ - type: v_measure
145
+ value: 48.37886340922374
146
+ - task:
147
+ type: Clustering
148
+ dataset:
149
+ type: mteb/arxiv-clustering-s2s
150
+ name: MTEB ArxivClusteringS2S
151
+ config: default
152
+ split: test
153
+ revision: b73bd54100e5abfa6e3a23dcafb46fe4d2438dc3
154
+ metrics:
155
+ - type: v_measure
156
+ value: 39.72488615315985
157
+ - task:
158
+ type: Reranking
159
+ dataset:
160
+ type: mteb/askubuntudupquestions-reranking
161
+ name: MTEB AskUbuntuDupQuestions
162
+ config: default
163
+ split: test
164
+ revision: 4d853f94cd57d85ec13805aeeac3ae3e5eb4c49c
165
+ metrics:
166
+ - type: map
167
+ value: 65.85199009344481
168
+ - type: mrr
169
+ value: 78.47700391329201
170
+ - task:
171
+ type: STS
172
+ dataset:
173
+ type: mteb/biosses-sts
174
+ name: MTEB BIOSSES
175
+ config: default
176
+ split: test
177
+ revision: 9ee918f184421b6bd48b78f6c714d86546106103
178
+ metrics:
179
+ - type: cos_sim_pearson
180
+ value: 84.47737119217858
181
+ - type: cos_sim_spearman
182
+ value: 80.43195317854409
183
+ - type: euclidean_pearson
184
+ value: 82.20496332547978
185
+ - type: euclidean_spearman
186
+ value: 80.43195317854409
187
+ - type: manhattan_pearson
188
+ value: 81.4836610720397
189
+ - type: manhattan_spearman
190
+ value: 79.65904400101908
191
+ - task:
192
+ type: Classification
193
+ dataset:
194
+ type: mteb/banking77
195
+ name: MTEB Banking77Classification
196
+ config: default
197
+ split: test
198
+ revision: 44fa15921b4c889113cc5df03dd4901b49161ab7
199
+ metrics:
200
+ - type: accuracy
201
+ value: 81.8603896103896
202
+ - type: f1
203
+ value: 81.28027245637479
204
+ - task:
205
+ type: Clustering
206
+ dataset:
207
+ type: mteb/biorxiv-clustering-p2p
208
+ name: MTEB BiorxivClusteringP2P
209
+ config: default
210
+ split: test
211
+ revision: 11d0121201d1f1f280e8cc8f3d98fb9c4d9f9c55
212
+ metrics:
213
+ - type: v_measure
214
+ value: 39.616605133625185
215
+ - task:
216
+ type: Clustering
217
+ dataset:
218
+ type: mteb/biorxiv-clustering-s2s
219
+ name: MTEB BiorxivClusteringS2S
220
+ config: default
221
+ split: test
222
+ revision: c0fab014e1bcb8d3a5e31b2088972a1e01547dc1
223
+ metrics:
224
+ - type: v_measure
225
+ value: 35.02442407186902
226
+ - task:
227
+ type: Retrieval
228
+ dataset:
229
+ type: BeIR/cqadupstack
230
+ name: MTEB CQADupstackAndroidRetrieval
231
+ config: default
232
+ split: test
233
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
234
+ metrics:
235
+ - type: map_at_1
236
+ value: 36.036
237
+ - type: map_at_10
238
+ value: 49.302
239
+ - type: map_at_100
240
+ value: 50.956
241
+ - type: map_at_1000
242
+ value: 51.080000000000005
243
+ - type: map_at_3
244
+ value: 45.237
245
+ - type: map_at_5
246
+ value: 47.353
247
+ - type: ndcg_at_1
248
+ value: 45.207
249
+ - type: ndcg_at_10
250
+ value: 56.485
251
+ - type: ndcg_at_100
252
+ value: 61.413
253
+ - type: ndcg_at_1000
254
+ value: 62.870000000000005
255
+ - type: ndcg_at_3
256
+ value: 51.346000000000004
257
+ - type: ndcg_at_5
258
+ value: 53.486
259
+ - type: precision_at_1
260
+ value: 45.207
261
+ - type: precision_at_10
262
+ value: 11.144
263
+ - type: precision_at_100
264
+ value: 1.735
265
+ - type: precision_at_1000
266
+ value: 0.22100000000000003
267
+ - type: precision_at_3
268
+ value: 24.94
269
+ - type: precision_at_5
270
+ value: 17.997
271
+ - type: recall_at_1
272
+ value: 36.036
273
+ - type: recall_at_10
274
+ value: 69.191
275
+ - type: recall_at_100
276
+ value: 89.423
277
+ - type: recall_at_1000
278
+ value: 98.425
279
+ - type: recall_at_3
280
+ value: 53.849999999999994
281
+ - type: recall_at_5
282
+ value: 60.107
283
+ - task:
284
+ type: Retrieval
285
+ dataset:
286
+ type: BeIR/cqadupstack
287
+ name: MTEB CQADupstackEnglishRetrieval
288
+ config: default
289
+ split: test
290
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
291
+ metrics:
292
+ - type: map_at_1
293
+ value: 32.92
294
+ - type: map_at_10
295
+ value: 45.739999999999995
296
+ - type: map_at_100
297
+ value: 47.309
298
+ - type: map_at_1000
299
+ value: 47.443000000000005
300
+ - type: map_at_3
301
+ value: 42.154
302
+ - type: map_at_5
303
+ value: 44.207
304
+ - type: ndcg_at_1
305
+ value: 42.229
306
+ - type: ndcg_at_10
307
+ value: 52.288999999999994
308
+ - type: ndcg_at_100
309
+ value: 57.04900000000001
310
+ - type: ndcg_at_1000
311
+ value: 58.788
312
+ - type: ndcg_at_3
313
+ value: 47.531
314
+ - type: ndcg_at_5
315
+ value: 49.861
316
+ - type: precision_at_1
317
+ value: 42.229
318
+ - type: precision_at_10
319
+ value: 10.299
320
+ - type: precision_at_100
321
+ value: 1.68
322
+ - type: precision_at_1000
323
+ value: 0.213
324
+ - type: precision_at_3
325
+ value: 23.673
326
+ - type: precision_at_5
327
+ value: 17.006
328
+ - type: recall_at_1
329
+ value: 32.92
330
+ - type: recall_at_10
331
+ value: 63.865
332
+ - type: recall_at_100
333
+ value: 84.06700000000001
334
+ - type: recall_at_1000
335
+ value: 94.536
336
+ - type: recall_at_3
337
+ value: 49.643
338
+ - type: recall_at_5
339
+ value: 56.119
340
+ - task:
341
+ type: Retrieval
342
+ dataset:
343
+ type: BeIR/cqadupstack
344
+ name: MTEB CQADupstackGamingRetrieval
345
+ config: default
346
+ split: test
347
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
348
+ metrics:
349
+ - type: map_at_1
350
+ value: 40.695
351
+ - type: map_at_10
352
+ value: 53.787
353
+ - type: map_at_100
354
+ value: 54.778000000000006
355
+ - type: map_at_1000
356
+ value: 54.827000000000005
357
+ - type: map_at_3
358
+ value: 50.151999999999994
359
+ - type: map_at_5
360
+ value: 52.207
361
+ - type: ndcg_at_1
362
+ value: 46.52
363
+ - type: ndcg_at_10
364
+ value: 60.026
365
+ - type: ndcg_at_100
366
+ value: 63.81099999999999
367
+ - type: ndcg_at_1000
368
+ value: 64.741
369
+ - type: ndcg_at_3
370
+ value: 53.83
371
+ - type: ndcg_at_5
372
+ value: 56.928999999999995
373
+ - type: precision_at_1
374
+ value: 46.52
375
+ - type: precision_at_10
376
+ value: 9.754999999999999
377
+ - type: precision_at_100
378
+ value: 1.2670000000000001
379
+ - type: precision_at_1000
380
+ value: 0.13799999999999998
381
+ - type: precision_at_3
382
+ value: 24.096
383
+ - type: precision_at_5
384
+ value: 16.689999999999998
385
+ - type: recall_at_1
386
+ value: 40.695
387
+ - type: recall_at_10
388
+ value: 75.181
389
+ - type: recall_at_100
390
+ value: 91.479
391
+ - type: recall_at_1000
392
+ value: 98.06899999999999
393
+ - type: recall_at_3
394
+ value: 58.707
395
+ - type: recall_at_5
396
+ value: 66.295
397
+ - task:
398
+ type: Retrieval
399
+ dataset:
400
+ type: BeIR/cqadupstack
401
+ name: MTEB CQADupstackGisRetrieval
402
+ config: default
403
+ split: test
404
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
405
+ metrics:
406
+ - type: map_at_1
407
+ value: 29.024
408
+ - type: map_at_10
409
+ value: 38.438
410
+ - type: map_at_100
411
+ value: 39.576
412
+ - type: map_at_1000
413
+ value: 39.645
414
+ - type: map_at_3
415
+ value: 34.827999999999996
416
+ - type: map_at_5
417
+ value: 36.947
418
+ - type: ndcg_at_1
419
+ value: 31.299
420
+ - type: ndcg_at_10
421
+ value: 44.268
422
+ - type: ndcg_at_100
423
+ value: 49.507
424
+ - type: ndcg_at_1000
425
+ value: 51.205999999999996
426
+ - type: ndcg_at_3
427
+ value: 37.248999999999995
428
+ - type: ndcg_at_5
429
+ value: 40.861999999999995
430
+ - type: precision_at_1
431
+ value: 31.299
432
+ - type: precision_at_10
433
+ value: 6.949
434
+ - type: precision_at_100
435
+ value: 1.012
436
+ - type: precision_at_1000
437
+ value: 0.11900000000000001
438
+ - type: precision_at_3
439
+ value: 15.518
440
+ - type: precision_at_5
441
+ value: 11.366999999999999
442
+ - type: recall_at_1
443
+ value: 29.024
444
+ - type: recall_at_10
445
+ value: 60.404
446
+ - type: recall_at_100
447
+ value: 83.729
448
+ - type: recall_at_1000
449
+ value: 96.439
450
+ - type: recall_at_3
451
+ value: 41.65
452
+ - type: recall_at_5
453
+ value: 50.263999999999996
454
+ - task:
455
+ type: Retrieval
456
+ dataset:
457
+ type: BeIR/cqadupstack
458
+ name: MTEB CQADupstackMathematicaRetrieval
459
+ config: default
460
+ split: test
461
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
462
+ metrics:
463
+ - type: map_at_1
464
+ value: 17.774
465
+ - type: map_at_10
466
+ value: 28.099
467
+ - type: map_at_100
468
+ value: 29.603
469
+ - type: map_at_1000
470
+ value: 29.709999999999997
471
+ - type: map_at_3
472
+ value: 25.036
473
+ - type: map_at_5
474
+ value: 26.657999999999998
475
+ - type: ndcg_at_1
476
+ value: 22.139
477
+ - type: ndcg_at_10
478
+ value: 34.205999999999996
479
+ - type: ndcg_at_100
480
+ value: 40.844
481
+ - type: ndcg_at_1000
482
+ value: 43.144
483
+ - type: ndcg_at_3
484
+ value: 28.732999999999997
485
+ - type: ndcg_at_5
486
+ value: 31.252000000000002
487
+ - type: precision_at_1
488
+ value: 22.139
489
+ - type: precision_at_10
490
+ value: 6.567
491
+ - type: precision_at_100
492
+ value: 1.147
493
+ - type: precision_at_1000
494
+ value: 0.146
495
+ - type: precision_at_3
496
+ value: 14.386
497
+ - type: precision_at_5
498
+ value: 10.423
499
+ - type: recall_at_1
500
+ value: 17.774
501
+ - type: recall_at_10
502
+ value: 48.32
503
+ - type: recall_at_100
504
+ value: 76.373
505
+ - type: recall_at_1000
506
+ value: 92.559
507
+ - type: recall_at_3
508
+ value: 33.478
509
+ - type: recall_at_5
510
+ value: 39.872
511
+ - task:
512
+ type: Retrieval
513
+ dataset:
514
+ type: BeIR/cqadupstack
515
+ name: MTEB CQADupstackPhysicsRetrieval
516
+ config: default
517
+ split: test
518
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
519
+ metrics:
520
+ - type: map_at_1
521
+ value: 31.885
522
+ - type: map_at_10
523
+ value: 44.289
524
+ - type: map_at_100
525
+ value: 45.757999999999996
526
+ - type: map_at_1000
527
+ value: 45.86
528
+ - type: map_at_3
529
+ value: 40.459
530
+ - type: map_at_5
531
+ value: 42.662
532
+ - type: ndcg_at_1
533
+ value: 39.75
534
+ - type: ndcg_at_10
535
+ value: 50.975
536
+ - type: ndcg_at_100
537
+ value: 56.528999999999996
538
+ - type: ndcg_at_1000
539
+ value: 58.06099999999999
540
+ - type: ndcg_at_3
541
+ value: 45.327
542
+ - type: ndcg_at_5
543
+ value: 48.041
544
+ - type: precision_at_1
545
+ value: 39.75
546
+ - type: precision_at_10
547
+ value: 9.557
548
+ - type: precision_at_100
549
+ value: 1.469
550
+ - type: precision_at_1000
551
+ value: 0.17700000000000002
552
+ - type: precision_at_3
553
+ value: 22.073
554
+ - type: precision_at_5
555
+ value: 15.765
556
+ - type: recall_at_1
557
+ value: 31.885
558
+ - type: recall_at_10
559
+ value: 64.649
560
+ - type: recall_at_100
561
+ value: 87.702
562
+ - type: recall_at_1000
563
+ value: 97.327
564
+ - type: recall_at_3
565
+ value: 48.61
566
+ - type: recall_at_5
567
+ value: 55.882
568
+ - task:
569
+ type: Retrieval
570
+ dataset:
571
+ type: BeIR/cqadupstack
572
+ name: MTEB CQADupstackProgrammersRetrieval
573
+ config: default
574
+ split: test
575
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
576
+ metrics:
577
+ - type: map_at_1
578
+ value: 26.454
579
+ - type: map_at_10
580
+ value: 37.756
581
+ - type: map_at_100
582
+ value: 39.225
583
+ - type: map_at_1000
584
+ value: 39.332
585
+ - type: map_at_3
586
+ value: 34.115
587
+ - type: map_at_5
588
+ value: 35.942
589
+ - type: ndcg_at_1
590
+ value: 32.42
591
+ - type: ndcg_at_10
592
+ value: 44.165
593
+ - type: ndcg_at_100
594
+ value: 50.202000000000005
595
+ - type: ndcg_at_1000
596
+ value: 52.188
597
+ - type: ndcg_at_3
598
+ value: 38.381
599
+ - type: ndcg_at_5
600
+ value: 40.849000000000004
601
+ - type: precision_at_1
602
+ value: 32.42
603
+ - type: precision_at_10
604
+ value: 8.482000000000001
605
+ - type: precision_at_100
606
+ value: 1.332
607
+ - type: precision_at_1000
608
+ value: 0.169
609
+ - type: precision_at_3
610
+ value: 18.683
611
+ - type: precision_at_5
612
+ value: 13.539000000000001
613
+ - type: recall_at_1
614
+ value: 26.454
615
+ - type: recall_at_10
616
+ value: 57.937000000000005
617
+ - type: recall_at_100
618
+ value: 83.76
619
+ - type: recall_at_1000
620
+ value: 96.82600000000001
621
+ - type: recall_at_3
622
+ value: 41.842
623
+ - type: recall_at_5
624
+ value: 48.285
625
+ - task:
626
+ type: Retrieval
627
+ dataset:
628
+ type: BeIR/cqadupstack
629
+ name: MTEB CQADupstackRetrieval
630
+ config: default
631
+ split: test
632
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
633
+ metrics:
634
+ - type: map_at_1
635
+ value: 27.743666666666666
636
+ - type: map_at_10
637
+ value: 38.75416666666667
638
+ - type: map_at_100
639
+ value: 40.133250000000004
640
+ - type: map_at_1000
641
+ value: 40.24616666666667
642
+ - type: map_at_3
643
+ value: 35.267250000000004
644
+ - type: map_at_5
645
+ value: 37.132749999999994
646
+ - type: ndcg_at_1
647
+ value: 33.14358333333333
648
+ - type: ndcg_at_10
649
+ value: 44.95916666666667
650
+ - type: ndcg_at_100
651
+ value: 50.46375
652
+ - type: ndcg_at_1000
653
+ value: 52.35508333333334
654
+ - type: ndcg_at_3
655
+ value: 39.17883333333334
656
+ - type: ndcg_at_5
657
+ value: 41.79724999999999
658
+ - type: precision_at_1
659
+ value: 33.14358333333333
660
+ - type: precision_at_10
661
+ value: 8.201083333333333
662
+ - type: precision_at_100
663
+ value: 1.3085
664
+ - type: precision_at_1000
665
+ value: 0.1665833333333333
666
+ - type: precision_at_3
667
+ value: 18.405583333333333
668
+ - type: precision_at_5
669
+ value: 13.233166666666666
670
+ - type: recall_at_1
671
+ value: 27.743666666666666
672
+ - type: recall_at_10
673
+ value: 58.91866666666667
674
+ - type: recall_at_100
675
+ value: 82.76216666666666
676
+ - type: recall_at_1000
677
+ value: 95.56883333333333
678
+ - type: recall_at_3
679
+ value: 42.86925
680
+ - type: recall_at_5
681
+ value: 49.553333333333335
682
+ - task:
683
+ type: Retrieval
684
+ dataset:
685
+ type: BeIR/cqadupstack
686
+ name: MTEB CQADupstackStatsRetrieval
687
+ config: default
688
+ split: test
689
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
690
+ metrics:
691
+ - type: map_at_1
692
+ value: 25.244
693
+ - type: map_at_10
694
+ value: 33.464
695
+ - type: map_at_100
696
+ value: 34.633
697
+ - type: map_at_1000
698
+ value: 34.721999999999994
699
+ - type: map_at_3
700
+ value: 30.784
701
+ - type: map_at_5
702
+ value: 32.183
703
+ - type: ndcg_at_1
704
+ value: 28.681
705
+ - type: ndcg_at_10
706
+ value: 38.149
707
+ - type: ndcg_at_100
708
+ value: 43.856
709
+ - type: ndcg_at_1000
710
+ value: 46.026
711
+ - type: ndcg_at_3
712
+ value: 33.318
713
+ - type: ndcg_at_5
714
+ value: 35.454
715
+ - type: precision_at_1
716
+ value: 28.681
717
+ - type: precision_at_10
718
+ value: 6.304
719
+ - type: precision_at_100
720
+ value: 0.992
721
+ - type: precision_at_1000
722
+ value: 0.125
723
+ - type: precision_at_3
724
+ value: 14.673
725
+ - type: precision_at_5
726
+ value: 10.245
727
+ - type: recall_at_1
728
+ value: 25.244
729
+ - type: recall_at_10
730
+ value: 49.711
731
+ - type: recall_at_100
732
+ value: 75.928
733
+ - type: recall_at_1000
734
+ value: 91.79899999999999
735
+ - type: recall_at_3
736
+ value: 36.325
737
+ - type: recall_at_5
738
+ value: 41.752
739
+ - task:
740
+ type: Retrieval
741
+ dataset:
742
+ type: BeIR/cqadupstack
743
+ name: MTEB CQADupstackTexRetrieval
744
+ config: default
745
+ split: test
746
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
747
+ metrics:
748
+ - type: map_at_1
749
+ value: 18.857
750
+ - type: map_at_10
751
+ value: 27.794
752
+ - type: map_at_100
753
+ value: 29.186
754
+ - type: map_at_1000
755
+ value: 29.323
756
+ - type: map_at_3
757
+ value: 24.779
758
+ - type: map_at_5
759
+ value: 26.459
760
+ - type: ndcg_at_1
761
+ value: 23.227999999999998
762
+ - type: ndcg_at_10
763
+ value: 33.353
764
+ - type: ndcg_at_100
765
+ value: 39.598
766
+ - type: ndcg_at_1000
767
+ value: 42.268
768
+ - type: ndcg_at_3
769
+ value: 28.054000000000002
770
+ - type: ndcg_at_5
771
+ value: 30.566
772
+ - type: precision_at_1
773
+ value: 23.227999999999998
774
+ - type: precision_at_10
775
+ value: 6.397
776
+ - type: precision_at_100
777
+ value: 1.129
778
+ - type: precision_at_1000
779
+ value: 0.155
780
+ - type: precision_at_3
781
+ value: 13.616
782
+ - type: precision_at_5
783
+ value: 10.116999999999999
784
+ - type: recall_at_1
785
+ value: 18.857
786
+ - type: recall_at_10
787
+ value: 45.797
788
+ - type: recall_at_100
789
+ value: 73.615
790
+ - type: recall_at_1000
791
+ value: 91.959
792
+ - type: recall_at_3
793
+ value: 31.129
794
+ - type: recall_at_5
795
+ value: 37.565
796
+ - task:
797
+ type: Retrieval
798
+ dataset:
799
+ type: BeIR/cqadupstack
800
+ name: MTEB CQADupstackUnixRetrieval
801
+ config: default
802
+ split: test
803
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
804
+ metrics:
805
+ - type: map_at_1
806
+ value: 27.486
807
+ - type: map_at_10
808
+ value: 39.164
809
+ - type: map_at_100
810
+ value: 40.543
811
+ - type: map_at_1000
812
+ value: 40.636
813
+ - type: map_at_3
814
+ value: 35.52
815
+ - type: map_at_5
816
+ value: 37.355
817
+ - type: ndcg_at_1
818
+ value: 32.275999999999996
819
+ - type: ndcg_at_10
820
+ value: 45.414
821
+ - type: ndcg_at_100
822
+ value: 51.254
823
+ - type: ndcg_at_1000
824
+ value: 53.044000000000004
825
+ - type: ndcg_at_3
826
+ value: 39.324999999999996
827
+ - type: ndcg_at_5
828
+ value: 41.835
829
+ - type: precision_at_1
830
+ value: 32.275999999999996
831
+ - type: precision_at_10
832
+ value: 8.144
833
+ - type: precision_at_100
834
+ value: 1.237
835
+ - type: precision_at_1000
836
+ value: 0.15
837
+ - type: precision_at_3
838
+ value: 18.501
839
+ - type: precision_at_5
840
+ value: 13.134
841
+ - type: recall_at_1
842
+ value: 27.486
843
+ - type: recall_at_10
844
+ value: 60.449
845
+ - type: recall_at_100
846
+ value: 85.176
847
+ - type: recall_at_1000
848
+ value: 97.087
849
+ - type: recall_at_3
850
+ value: 43.59
851
+ - type: recall_at_5
852
+ value: 50.08899999999999
853
+ - task:
854
+ type: Retrieval
855
+ dataset:
856
+ type: BeIR/cqadupstack
857
+ name: MTEB CQADupstackWebmastersRetrieval
858
+ config: default
859
+ split: test
860
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
861
+ metrics:
862
+ - type: map_at_1
863
+ value: 26.207
864
+ - type: map_at_10
865
+ value: 37.255
866
+ - type: map_at_100
867
+ value: 39.043
868
+ - type: map_at_1000
869
+ value: 39.273
870
+ - type: map_at_3
871
+ value: 33.487
872
+ - type: map_at_5
873
+ value: 35.441
874
+ - type: ndcg_at_1
875
+ value: 31.423000000000002
876
+ - type: ndcg_at_10
877
+ value: 44.235
878
+ - type: ndcg_at_100
879
+ value: 50.49
880
+ - type: ndcg_at_1000
881
+ value: 52.283
882
+ - type: ndcg_at_3
883
+ value: 37.602000000000004
884
+ - type: ndcg_at_5
885
+ value: 40.518
886
+ - type: precision_at_1
887
+ value: 31.423000000000002
888
+ - type: precision_at_10
889
+ value: 8.715
890
+ - type: precision_at_100
891
+ value: 1.7590000000000001
892
+ - type: precision_at_1000
893
+ value: 0.257
894
+ - type: precision_at_3
895
+ value: 17.523
896
+ - type: precision_at_5
897
+ value: 13.161999999999999
898
+ - type: recall_at_1
899
+ value: 26.207
900
+ - type: recall_at_10
901
+ value: 59.17099999999999
902
+ - type: recall_at_100
903
+ value: 86.166
904
+ - type: recall_at_1000
905
+ value: 96.54700000000001
906
+ - type: recall_at_3
907
+ value: 41.18
908
+ - type: recall_at_5
909
+ value: 48.083999999999996
910
+ - task:
911
+ type: Retrieval
912
+ dataset:
913
+ type: BeIR/cqadupstack
914
+ name: MTEB CQADupstackWordpressRetrieval
915
+ config: default
916
+ split: test
917
+ revision: 2b9f5791698b5be7bc5e10535c8690f20043c3db
918
+ metrics:
919
+ - type: map_at_1
920
+ value: 20.342
921
+ - type: map_at_10
922
+ value: 29.962
923
+ - type: map_at_100
924
+ value: 30.989
925
+ - type: map_at_1000
926
+ value: 31.102999999999998
927
+ - type: map_at_3
928
+ value: 26.656000000000002
929
+ - type: map_at_5
930
+ value: 28.179
931
+ - type: ndcg_at_1
932
+ value: 22.551
933
+ - type: ndcg_at_10
934
+ value: 35.945
935
+ - type: ndcg_at_100
936
+ value: 41.012
937
+ - type: ndcg_at_1000
938
+ value: 43.641999999999996
939
+ - type: ndcg_at_3
940
+ value: 29.45
941
+ - type: ndcg_at_5
942
+ value: 31.913999999999998
943
+ - type: precision_at_1
944
+ value: 22.551
945
+ - type: precision_at_10
946
+ value: 6.1
947
+ - type: precision_at_100
948
+ value: 0.943
949
+ - type: precision_at_1000
950
+ value: 0.129
951
+ - type: precision_at_3
952
+ value: 13.184999999999999
953
+ - type: precision_at_5
954
+ value: 9.353
955
+ - type: recall_at_1
956
+ value: 20.342
957
+ - type: recall_at_10
958
+ value: 52.349000000000004
959
+ - type: recall_at_100
960
+ value: 75.728
961
+ - type: recall_at_1000
962
+ value: 95.253
963
+ - type: recall_at_3
964
+ value: 34.427
965
+ - type: recall_at_5
966
+ value: 40.326
967
+ - task:
968
+ type: Retrieval
969
+ dataset:
970
+ type: climate-fever
971
+ name: MTEB ClimateFEVER
972
+ config: default
973
+ split: test
974
+ revision: 392b78eb68c07badcd7c2cd8f39af108375dfcce
975
+ metrics:
976
+ - type: map_at_1
977
+ value: 7.71
978
+ - type: map_at_10
979
+ value: 14.81
980
+ - type: map_at_100
981
+ value: 16.536
982
+ - type: map_at_1000
983
+ value: 16.744999999999997
984
+ - type: map_at_3
985
+ value: 12.109
986
+ - type: map_at_5
987
+ value: 13.613
988
+ - type: ndcg_at_1
989
+ value: 18.046
990
+ - type: ndcg_at_10
991
+ value: 21.971
992
+ - type: ndcg_at_100
993
+ value: 29.468
994
+ - type: ndcg_at_1000
995
+ value: 33.428999999999995
996
+ - type: ndcg_at_3
997
+ value: 17.227999999999998
998
+ - type: ndcg_at_5
999
+ value: 19.189999999999998
1000
+ - type: precision_at_1
1001
+ value: 18.046
1002
+ - type: precision_at_10
1003
+ value: 7.192
1004
+ - type: precision_at_100
1005
+ value: 1.51
1006
+ - type: precision_at_1000
1007
+ value: 0.22499999999999998
1008
+ - type: precision_at_3
1009
+ value: 13.312
1010
+ - type: precision_at_5
1011
+ value: 10.775
1012
+ - type: recall_at_1
1013
+ value: 7.71
1014
+ - type: recall_at_10
1015
+ value: 27.908
1016
+ - type: recall_at_100
1017
+ value: 54.452
1018
+ - type: recall_at_1000
1019
+ value: 76.764
1020
+ - type: recall_at_3
1021
+ value: 16.64
1022
+ - type: recall_at_5
1023
+ value: 21.631
1024
+ - task:
1025
+ type: Retrieval
1026
+ dataset:
1027
+ type: dbpedia-entity
1028
+ name: MTEB DBPedia
1029
+ config: default
1030
+ split: test
1031
+ revision: f097057d03ed98220bc7309ddb10b71a54d667d6
1032
+ metrics:
1033
+ - type: map_at_1
1034
+ value: 6.8180000000000005
1035
+ - type: map_at_10
1036
+ value: 14.591000000000001
1037
+ - type: map_at_100
1038
+ value: 19.855999999999998
1039
+ - type: map_at_1000
1040
+ value: 21.178
1041
+ - type: map_at_3
1042
+ value: 10.345
1043
+ - type: map_at_5
1044
+ value: 12.367
1045
+ - type: ndcg_at_1
1046
+ value: 39.25
1047
+ - type: ndcg_at_10
1048
+ value: 32.088
1049
+ - type: ndcg_at_100
1050
+ value: 36.019
1051
+ - type: ndcg_at_1000
1052
+ value: 43.649
1053
+ - type: ndcg_at_3
1054
+ value: 35.132999999999996
1055
+ - type: ndcg_at_5
1056
+ value: 33.777
1057
+ - type: precision_at_1
1058
+ value: 49.5
1059
+ - type: precision_at_10
1060
+ value: 25.624999999999996
1061
+ - type: precision_at_100
1062
+ value: 8.043
1063
+ - type: precision_at_1000
1064
+ value: 1.7409999999999999
1065
+ - type: precision_at_3
1066
+ value: 38.417
1067
+ - type: precision_at_5
1068
+ value: 33.2
1069
+ - type: recall_at_1
1070
+ value: 6.8180000000000005
1071
+ - type: recall_at_10
1072
+ value: 20.399
1073
+ - type: recall_at_100
1074
+ value: 42.8
1075
+ - type: recall_at_1000
1076
+ value: 68.081
1077
+ - type: recall_at_3
1078
+ value: 11.928999999999998
1079
+ - type: recall_at_5
1080
+ value: 15.348999999999998
1081
+ - task:
1082
+ type: Classification
1083
+ dataset:
1084
+ type: mteb/emotion
1085
+ name: MTEB EmotionClassification
1086
+ config: default
1087
+ split: test
1088
+ revision: 829147f8f75a25f005913200eb5ed41fae320aa1
1089
+ metrics:
1090
+ - type: accuracy
1091
+ value: 39.725
1092
+ - type: f1
1093
+ value: 35.19385687310605
1094
+ - task:
1095
+ type: Retrieval
1096
+ dataset:
1097
+ type: fever
1098
+ name: MTEB FEVER
1099
+ config: default
1100
+ split: test
1101
+ revision: 1429cf27e393599b8b359b9b72c666f96b2525f9
1102
+ metrics:
1103
+ - type: map_at_1
1104
+ value: 31.901000000000003
1105
+ - type: map_at_10
1106
+ value: 44.156
1107
+ - type: map_at_100
1108
+ value: 44.901
1109
+ - type: map_at_1000
1110
+ value: 44.939
1111
+ - type: map_at_3
1112
+ value: 41.008
1113
+ - type: map_at_5
1114
+ value: 42.969
1115
+ - type: ndcg_at_1
1116
+ value: 34.263
1117
+ - type: ndcg_at_10
1118
+ value: 50.863
1119
+ - type: ndcg_at_100
1120
+ value: 54.336
1121
+ - type: ndcg_at_1000
1122
+ value: 55.297
1123
+ - type: ndcg_at_3
1124
+ value: 44.644
1125
+ - type: ndcg_at_5
1126
+ value: 48.075
1127
+ - type: precision_at_1
1128
+ value: 34.263
1129
+ - type: precision_at_10
1130
+ value: 7.542999999999999
1131
+ - type: precision_at_100
1132
+ value: 0.9400000000000001
1133
+ - type: precision_at_1000
1134
+ value: 0.104
1135
+ - type: precision_at_3
1136
+ value: 18.912000000000003
1137
+ - type: precision_at_5
1138
+ value: 13.177
1139
+ - type: recall_at_1
1140
+ value: 31.901000000000003
1141
+ - type: recall_at_10
1142
+ value: 68.872
1143
+ - type: recall_at_100
1144
+ value: 84.468
1145
+ - type: recall_at_1000
1146
+ value: 91.694
1147
+ - type: recall_at_3
1148
+ value: 52.272
1149
+ - type: recall_at_5
1150
+ value: 60.504999999999995
1151
+ - task:
1152
+ type: Retrieval
1153
+ dataset:
1154
+ type: fiqa
1155
+ name: MTEB FiQA2018
1156
+ config: default
1157
+ split: test
1158
+ revision: 41b686a7f28c59bcaaa5791efd47c67c8ebe28be
1159
+ metrics:
1160
+ - type: map_at_1
1161
+ value: 24.4
1162
+ - type: map_at_10
1163
+ value: 41.117
1164
+ - type: map_at_100
1165
+ value: 43.167
1166
+ - type: map_at_1000
1167
+ value: 43.323
1168
+ - type: map_at_3
1169
+ value: 35.744
1170
+ - type: map_at_5
1171
+ value: 38.708
1172
+ - type: ndcg_at_1
1173
+ value: 49.074
1174
+ - type: ndcg_at_10
1175
+ value: 49.963
1176
+ - type: ndcg_at_100
1177
+ value: 56.564
1178
+ - type: ndcg_at_1000
1179
+ value: 58.931999999999995
1180
+ - type: ndcg_at_3
1181
+ value: 45.489000000000004
1182
+ - type: ndcg_at_5
1183
+ value: 47.133
1184
+ - type: precision_at_1
1185
+ value: 49.074
1186
+ - type: precision_at_10
1187
+ value: 13.889000000000001
1188
+ - type: precision_at_100
1189
+ value: 2.091
1190
+ - type: precision_at_1000
1191
+ value: 0.251
1192
+ - type: precision_at_3
1193
+ value: 30.658
1194
+ - type: precision_at_5
1195
+ value: 22.593
1196
+ - type: recall_at_1
1197
+ value: 24.4
1198
+ - type: recall_at_10
1199
+ value: 58.111999999999995
1200
+ - type: recall_at_100
1201
+ value: 81.96900000000001
1202
+ - type: recall_at_1000
1203
+ value: 96.187
1204
+ - type: recall_at_3
1205
+ value: 41.661
1206
+ - type: recall_at_5
1207
+ value: 49.24
1208
+ - task:
1209
+ type: Retrieval
1210
+ dataset:
1211
+ type: hotpotqa
1212
+ name: MTEB HotpotQA
1213
+ config: default
1214
+ split: test
1215
+ revision: 766870b35a1b9ca65e67a0d1913899973551fc6c
1216
+ metrics:
1217
+ - type: map_at_1
1218
+ value: 22.262
1219
+ - type: map_at_10
1220
+ value: 31.266
1221
+ - type: map_at_100
1222
+ value: 32.202
1223
+ - type: map_at_1000
1224
+ value: 32.300000000000004
1225
+ - type: map_at_3
1226
+ value: 28.874
1227
+ - type: map_at_5
1228
+ value: 30.246000000000002
1229
+ - type: ndcg_at_1
1230
+ value: 44.524
1231
+ - type: ndcg_at_10
1232
+ value: 39.294000000000004
1233
+ - type: ndcg_at_100
1234
+ value: 43.296
1235
+ - type: ndcg_at_1000
1236
+ value: 45.561
1237
+ - type: ndcg_at_3
1238
+ value: 35.013
1239
+ - type: ndcg_at_5
1240
+ value: 37.177
1241
+ - type: precision_at_1
1242
+ value: 44.524
1243
+ - type: precision_at_10
1244
+ value: 8.52
1245
+ - type: precision_at_100
1246
+ value: 1.169
1247
+ - type: precision_at_1000
1248
+ value: 0.147
1249
+ - type: precision_at_3
1250
+ value: 22.003
1251
+ - type: precision_at_5
1252
+ value: 14.914
1253
+ - type: recall_at_1
1254
+ value: 22.262
1255
+ - type: recall_at_10
1256
+ value: 42.6
1257
+ - type: recall_at_100
1258
+ value: 58.46
1259
+ - type: recall_at_1000
1260
+ value: 73.565
1261
+ - type: recall_at_3
1262
+ value: 33.005
1263
+ - type: recall_at_5
1264
+ value: 37.286
1265
+ - task:
1266
+ type: Classification
1267
+ dataset:
1268
+ type: mteb/imdb
1269
+ name: MTEB ImdbClassification
1270
+ config: default
1271
+ split: test
1272
+ revision: 8d743909f834c38949e8323a8a6ce8721ea6c7f4
1273
+ metrics:
1274
+ - type: accuracy
1275
+ value: 70.7156
1276
+ - type: ap
1277
+ value: 64.89470531959896
1278
+ - type: f1
1279
+ value: 70.53051887683772
1280
+ - task:
1281
+ type: Retrieval
1282
+ dataset:
1283
+ type: msmarco
1284
+ name: MTEB MSMARCO
1285
+ config: default
1286
+ split: dev
1287
+ revision: e6838a846e2408f22cf5cc337ebc83e0bcf77849
1288
+ metrics:
1289
+ - type: map_at_1
1290
+ value: 21.174
1291
+ - type: map_at_10
1292
+ value: 33.0
1293
+ - type: map_at_100
1294
+ value: 34.178
1295
+ - type: map_at_1000
1296
+ value: 34.227000000000004
1297
+ - type: map_at_3
1298
+ value: 29.275000000000002
1299
+ - type: map_at_5
1300
+ value: 31.341
1301
+ - type: ndcg_at_1
1302
+ value: 21.776999999999997
1303
+ - type: ndcg_at_10
1304
+ value: 39.745999999999995
1305
+ - type: ndcg_at_100
1306
+ value: 45.488
1307
+ - type: ndcg_at_1000
1308
+ value: 46.733999999999995
1309
+ - type: ndcg_at_3
1310
+ value: 32.086
1311
+ - type: ndcg_at_5
1312
+ value: 35.792
1313
+ - type: precision_at_1
1314
+ value: 21.776999999999997
1315
+ - type: precision_at_10
1316
+ value: 6.324000000000001
1317
+ - type: precision_at_100
1318
+ value: 0.922
1319
+ - type: precision_at_1000
1320
+ value: 0.10300000000000001
1321
+ - type: precision_at_3
1322
+ value: 13.696
1323
+ - type: precision_at_5
1324
+ value: 10.100000000000001
1325
+ - type: recall_at_1
1326
+ value: 21.174
1327
+ - type: recall_at_10
1328
+ value: 60.488
1329
+ - type: recall_at_100
1330
+ value: 87.234
1331
+ - type: recall_at_1000
1332
+ value: 96.806
1333
+ - type: recall_at_3
1334
+ value: 39.582
1335
+ - type: recall_at_5
1336
+ value: 48.474000000000004
1337
+ - task:
1338
+ type: Classification
1339
+ dataset:
1340
+ type: mteb/mtop_domain
1341
+ name: MTEB MTOPDomainClassification (en)
1342
+ config: en
1343
+ split: test
1344
+ revision: a7e2a951126a26fc8c6a69f835f33a346ba259e3
1345
+ metrics:
1346
+ - type: accuracy
1347
+ value: 92.07934336525308
1348
+ - type: f1
1349
+ value: 91.93440027035814
1350
+ - task:
1351
+ type: Classification
1352
+ dataset:
1353
+ type: mteb/mtop_intent
1354
+ name: MTEB MTOPIntentClassification (en)
1355
+ config: en
1356
+ split: test
1357
+ revision: 6299947a7777084cc2d4b64235bf7190381ce755
1358
+ metrics:
1359
+ - type: accuracy
1360
+ value: 70.20975832193344
1361
+ - type: f1
1362
+ value: 48.571776628850074
1363
+ - task:
1364
+ type: Classification
1365
+ dataset:
1366
+ type: mteb/amazon_massive_intent
1367
+ name: MTEB MassiveIntentClassification (en)
1368
+ config: en
1369
+ split: test
1370
+ revision: 072a486a144adf7f4479a4a0dddb2152e161e1ea
1371
+ metrics:
1372
+ - type: accuracy
1373
+ value: 69.56624075319435
1374
+ - type: f1
1375
+ value: 67.64419185784621
1376
+ - task:
1377
+ type: Classification
1378
+ dataset:
1379
+ type: mteb/amazon_massive_scenario
1380
+ name: MTEB MassiveScenarioClassification (en)
1381
+ config: en
1382
+ split: test
1383
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1384
+ metrics:
1385
+ - type: accuracy
1386
+ value: 76.01210490921318
1387
+ - type: f1
1388
+ value: 75.1934366365826
1389
+ - task:
1390
+ type: Clustering
1391
+ dataset:
1392
+ type: mteb/medrxiv-clustering-p2p
1393
+ name: MTEB MedrxivClusteringP2P
1394
+ config: default
1395
+ split: test
1396
+ revision: dcefc037ef84348e49b0d29109e891c01067226b
1397
+ metrics:
1398
+ - type: v_measure
1399
+ value: 35.58002813186373
1400
+ - task:
1401
+ type: Clustering
1402
+ dataset:
1403
+ type: mteb/medrxiv-clustering-s2s
1404
+ name: MTEB MedrxivClusteringS2S
1405
+ config: default
1406
+ split: test
1407
+ revision: 3cd0e71dfbe09d4de0f9e5ecba43e7ce280959dc
1408
+ metrics:
1409
+ - type: v_measure
1410
+ value: 32.872725562410444
1411
+ - task:
1412
+ type: Reranking
1413
+ dataset:
1414
+ type: mteb/mind_small
1415
+ name: MTEB MindSmallReranking
1416
+ config: default
1417
+ split: test
1418
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1419
+ metrics:
1420
+ - type: map
1421
+ value: 30.965343604861328
1422
+ - type: mrr
1423
+ value: 31.933710165863594
1424
+ - task:
1425
+ type: Retrieval
1426
+ dataset:
1427
+ type: nfcorpus
1428
+ name: MTEB NFCorpus
1429
+ config: default
1430
+ split: test
1431
+ revision: 7eb63cc0c1eb59324d709ebed25fcab851fa7610
1432
+ metrics:
1433
+ - type: map_at_1
1434
+ value: 4.938
1435
+ - type: map_at_10
1436
+ value: 12.034
1437
+ - type: map_at_100
1438
+ value: 15.675
1439
+ - type: map_at_1000
1440
+ value: 17.18
1441
+ - type: map_at_3
1442
+ value: 8.471
1443
+ - type: map_at_5
1444
+ value: 10.128
1445
+ - type: ndcg_at_1
1446
+ value: 40.402
1447
+ - type: ndcg_at_10
1448
+ value: 33.289
1449
+ - type: ndcg_at_100
1450
+ value: 31.496000000000002
1451
+ - type: ndcg_at_1000
1452
+ value: 40.453
1453
+ - type: ndcg_at_3
1454
+ value: 37.841
1455
+ - type: ndcg_at_5
1456
+ value: 36.215
1457
+ - type: precision_at_1
1458
+ value: 41.796
1459
+ - type: precision_at_10
1460
+ value: 25.294
1461
+ - type: precision_at_100
1462
+ value: 8.381
1463
+ - type: precision_at_1000
1464
+ value: 2.1260000000000003
1465
+ - type: precision_at_3
1466
+ value: 36.429
1467
+ - type: precision_at_5
1468
+ value: 32.446000000000005
1469
+ - type: recall_at_1
1470
+ value: 4.938
1471
+ - type: recall_at_10
1472
+ value: 16.637
1473
+ - type: recall_at_100
1474
+ value: 33.853
1475
+ - type: recall_at_1000
1476
+ value: 66.07
1477
+ - type: recall_at_3
1478
+ value: 9.818
1479
+ - type: recall_at_5
1480
+ value: 12.544
1481
+ - task:
1482
+ type: Retrieval
1483
+ dataset:
1484
+ type: nq
1485
+ name: MTEB NQ
1486
+ config: default
1487
+ split: test
1488
+ revision: 6062aefc120bfe8ece5897809fb2e53bfe0d128c
1489
+ metrics:
1490
+ - type: map_at_1
1491
+ value: 27.124
1492
+ - type: map_at_10
1493
+ value: 42.418
1494
+ - type: map_at_100
1495
+ value: 43.633
1496
+ - type: map_at_1000
1497
+ value: 43.66
1498
+ - type: map_at_3
1499
+ value: 37.766
1500
+ - type: map_at_5
1501
+ value: 40.482
1502
+ - type: ndcg_at_1
1503
+ value: 30.794
1504
+ - type: ndcg_at_10
1505
+ value: 50.449999999999996
1506
+ - type: ndcg_at_100
1507
+ value: 55.437999999999995
1508
+ - type: ndcg_at_1000
1509
+ value: 56.084
1510
+ - type: ndcg_at_3
1511
+ value: 41.678
1512
+ - type: ndcg_at_5
1513
+ value: 46.257
1514
+ - type: precision_at_1
1515
+ value: 30.794
1516
+ - type: precision_at_10
1517
+ value: 8.656
1518
+ - type: precision_at_100
1519
+ value: 1.141
1520
+ - type: precision_at_1000
1521
+ value: 0.12
1522
+ - type: precision_at_3
1523
+ value: 19.37
1524
+ - type: precision_at_5
1525
+ value: 14.218
1526
+ - type: recall_at_1
1527
+ value: 27.124
1528
+ - type: recall_at_10
1529
+ value: 72.545
1530
+ - type: recall_at_100
1531
+ value: 93.938
1532
+ - type: recall_at_1000
1533
+ value: 98.788
1534
+ - type: recall_at_3
1535
+ value: 49.802
1536
+ - type: recall_at_5
1537
+ value: 60.426
1538
+ - task:
1539
+ type: Retrieval
1540
+ dataset:
1541
+ type: quora
1542
+ name: MTEB QuoraRetrieval
1543
+ config: default
1544
+ split: test
1545
+ revision: 6205996560df11e3a3da9ab4f926788fc30a7db4
1546
+ metrics:
1547
+ - type: map_at_1
1548
+ value: 69.33500000000001
1549
+ - type: map_at_10
1550
+ value: 83.554
1551
+ - type: map_at_100
1552
+ value: 84.237
1553
+ - type: map_at_1000
1554
+ value: 84.251
1555
+ - type: map_at_3
1556
+ value: 80.456
1557
+ - type: map_at_5
1558
+ value: 82.395
1559
+ - type: ndcg_at_1
1560
+ value: 80.06
1561
+ - type: ndcg_at_10
1562
+ value: 87.46199999999999
1563
+ - type: ndcg_at_100
1564
+ value: 88.774
1565
+ - type: ndcg_at_1000
1566
+ value: 88.864
1567
+ - type: ndcg_at_3
1568
+ value: 84.437
1569
+ - type: ndcg_at_5
1570
+ value: 86.129
1571
+ - type: precision_at_1
1572
+ value: 80.06
1573
+ - type: precision_at_10
1574
+ value: 13.418
1575
+ - type: precision_at_100
1576
+ value: 1.536
1577
+ - type: precision_at_1000
1578
+ value: 0.157
1579
+ - type: precision_at_3
1580
+ value: 37.103
1581
+ - type: precision_at_5
1582
+ value: 24.522
1583
+ - type: recall_at_1
1584
+ value: 69.33500000000001
1585
+ - type: recall_at_10
1586
+ value: 95.03200000000001
1587
+ - type: recall_at_100
1588
+ value: 99.559
1589
+ - type: recall_at_1000
1590
+ value: 99.98700000000001
1591
+ - type: recall_at_3
1592
+ value: 86.404
1593
+ - type: recall_at_5
1594
+ value: 91.12400000000001
1595
+ - task:
1596
+ type: Clustering
1597
+ dataset:
1598
+ type: mteb/reddit-clustering
1599
+ name: MTEB RedditClustering
1600
+ config: default
1601
+ split: test
1602
+ revision: b2805658ae38990172679479369a78b86de8c390
1603
+ metrics:
1604
+ - type: v_measure
1605
+ value: 54.824256698437324
1606
+ - task:
1607
+ type: Clustering
1608
+ dataset:
1609
+ type: mteb/reddit-clustering-p2p
1610
+ name: MTEB RedditClusteringP2P
1611
+ config: default
1612
+ split: test
1613
+ revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
1614
+ metrics:
1615
+ - type: v_measure
1616
+ value: 56.768972678049366
1617
+ - task:
1618
+ type: Retrieval
1619
+ dataset:
1620
+ type: scidocs
1621
+ name: MTEB SCIDOCS
1622
+ config: default
1623
+ split: test
1624
+ revision: 5c59ef3e437a0a9651c8fe6fde943e7dce59fba5
1625
+ metrics:
1626
+ - type: map_at_1
1627
+ value: 5.192
1628
+ - type: map_at_10
1629
+ value: 14.426
1630
+ - type: map_at_100
1631
+ value: 17.18
1632
+ - type: map_at_1000
1633
+ value: 17.580000000000002
1634
+ - type: map_at_3
1635
+ value: 9.94
1636
+ - type: map_at_5
1637
+ value: 12.077
1638
+ - type: ndcg_at_1
1639
+ value: 25.5
1640
+ - type: ndcg_at_10
1641
+ value: 23.765
1642
+ - type: ndcg_at_100
1643
+ value: 33.664
1644
+ - type: ndcg_at_1000
1645
+ value: 39.481
1646
+ - type: ndcg_at_3
1647
+ value: 21.813
1648
+ - type: ndcg_at_5
1649
+ value: 19.285
1650
+ - type: precision_at_1
1651
+ value: 25.5
1652
+ - type: precision_at_10
1653
+ value: 12.690000000000001
1654
+ - type: precision_at_100
1655
+ value: 2.71
1656
+ - type: precision_at_1000
1657
+ value: 0.409
1658
+ - type: precision_at_3
1659
+ value: 20.732999999999997
1660
+ - type: precision_at_5
1661
+ value: 17.24
1662
+ - type: recall_at_1
1663
+ value: 5.192
1664
+ - type: recall_at_10
1665
+ value: 25.712000000000003
1666
+ - type: recall_at_100
1667
+ value: 54.99699999999999
1668
+ - type: recall_at_1000
1669
+ value: 82.97200000000001
1670
+ - type: recall_at_3
1671
+ value: 12.631999999999998
1672
+ - type: recall_at_5
1673
+ value: 17.497
1674
+ - task:
1675
+ type: STS
1676
+ dataset:
1677
+ type: mteb/sickr-sts
1678
+ name: MTEB SICK-R
1679
+ config: default
1680
+ split: test
1681
+ revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
1682
+ metrics:
1683
+ - type: cos_sim_pearson
1684
+ value: 84.00280838354293
1685
+ - type: cos_sim_spearman
1686
+ value: 80.5854192844009
1687
+ - type: euclidean_pearson
1688
+ value: 80.55974827073891
1689
+ - type: euclidean_spearman
1690
+ value: 80.58541460172292
1691
+ - type: manhattan_pearson
1692
+ value: 80.27294578437488
1693
+ - type: manhattan_spearman
1694
+ value: 80.33176193921884
1695
+ - task:
1696
+ type: STS
1697
+ dataset:
1698
+ type: mteb/sts12-sts
1699
+ name: MTEB STS12
1700
+ config: default
1701
+ split: test
1702
+ revision: fdf84275bb8ce4b49c971d02e84dd1abc677a50f
1703
+ metrics:
1704
+ - type: cos_sim_pearson
1705
+ value: 83.2801353818369
1706
+ - type: cos_sim_spearman
1707
+ value: 72.63427853822449
1708
+ - type: euclidean_pearson
1709
+ value: 79.01343235899544
1710
+ - type: euclidean_spearman
1711
+ value: 72.63178302036903
1712
+ - type: manhattan_pearson
1713
+ value: 78.65899981586094
1714
+ - type: manhattan_spearman
1715
+ value: 72.26646573268035
1716
+ - task:
1717
+ type: STS
1718
+ dataset:
1719
+ type: mteb/sts13-sts
1720
+ name: MTEB STS13
1721
+ config: default
1722
+ split: test
1723
+ revision: 1591bfcbe8c69d4bf7fe2a16e2451017832cafb9
1724
+ metrics:
1725
+ - type: cos_sim_pearson
1726
+ value: 83.20700572036095
1727
+ - type: cos_sim_spearman
1728
+ value: 83.48499016384896
1729
+ - type: euclidean_pearson
1730
+ value: 82.82555353364394
1731
+ - type: euclidean_spearman
1732
+ value: 83.48499008964005
1733
+ - type: manhattan_pearson
1734
+ value: 82.46034885462956
1735
+ - type: manhattan_spearman
1736
+ value: 83.09829447251937
1737
+ - task:
1738
+ type: STS
1739
+ dataset:
1740
+ type: mteb/sts14-sts
1741
+ name: MTEB STS14
1742
+ config: default
1743
+ split: test
1744
+ revision: e2125984e7df8b7871f6ae9949cf6b6795e7c54b
1745
+ metrics:
1746
+ - type: cos_sim_pearson
1747
+ value: 82.27113025749529
1748
+ - type: cos_sim_spearman
1749
+ value: 78.0001371342168
1750
+ - type: euclidean_pearson
1751
+ value: 80.62651938409732
1752
+ - type: euclidean_spearman
1753
+ value: 78.0001341029446
1754
+ - type: manhattan_pearson
1755
+ value: 80.25786381999085
1756
+ - type: manhattan_spearman
1757
+ value: 77.68750207429126
1758
+ - task:
1759
+ type: STS
1760
+ dataset:
1761
+ type: mteb/sts15-sts
1762
+ name: MTEB STS15
1763
+ config: default
1764
+ split: test
1765
+ revision: 1cd7298cac12a96a373b6a2f18738bb3e739a9b6
1766
+ metrics:
1767
+ - type: cos_sim_pearson
1768
+ value: 84.98824030948605
1769
+ - type: cos_sim_spearman
1770
+ value: 85.66275391649481
1771
+ - type: euclidean_pearson
1772
+ value: 84.88733530073506
1773
+ - type: euclidean_spearman
1774
+ value: 85.66275062257034
1775
+ - type: manhattan_pearson
1776
+ value: 84.70100813924223
1777
+ - type: manhattan_spearman
1778
+ value: 85.50318526944764
1779
+ - task:
1780
+ type: STS
1781
+ dataset:
1782
+ type: mteb/sts16-sts
1783
+ name: MTEB STS16
1784
+ config: default
1785
+ split: test
1786
+ revision: 360a0b2dff98700d09e634a01e1cc1624d3e42cd
1787
+ metrics:
1788
+ - type: cos_sim_pearson
1789
+ value: 78.82478639193744
1790
+ - type: cos_sim_spearman
1791
+ value: 80.03011315645662
1792
+ - type: euclidean_pearson
1793
+ value: 79.84794502236802
1794
+ - type: euclidean_spearman
1795
+ value: 80.03011258077692
1796
+ - type: manhattan_pearson
1797
+ value: 79.47012152325492
1798
+ - type: manhattan_spearman
1799
+ value: 79.60652985087651
1800
+ - task:
1801
+ type: STS
1802
+ dataset:
1803
+ type: mteb/sts17-crosslingual-sts
1804
+ name: MTEB STS17 (en-en)
1805
+ config: en-en
1806
+ split: test
1807
+ revision: 9fc37e8c632af1c87a3d23e685d49552a02582a0
1808
+ metrics:
1809
+ - type: cos_sim_pearson
1810
+ value: 90.90804154377126
1811
+ - type: cos_sim_spearman
1812
+ value: 90.59523263123734
1813
+ - type: euclidean_pearson
1814
+ value: 89.8466957775513
1815
+ - type: euclidean_spearman
1816
+ value: 90.59523263123734
1817
+ - type: manhattan_pearson
1818
+ value: 89.82268413033941
1819
+ - type: manhattan_spearman
1820
+ value: 90.68706496728889
1821
+ - task:
1822
+ type: STS
1823
+ dataset:
1824
+ type: mteb/sts22-crosslingual-sts
1825
+ name: MTEB STS22 (en)
1826
+ config: en
1827
+ split: test
1828
+ revision: 2de6ce8c1921b71a755b262c6b57fef195dd7906
1829
+ metrics:
1830
+ - type: cos_sim_pearson
1831
+ value: 66.78771571400975
1832
+ - type: cos_sim_spearman
1833
+ value: 67.94534221542501
1834
+ - type: euclidean_pearson
1835
+ value: 68.62534447097993
1836
+ - type: euclidean_spearman
1837
+ value: 67.94534221542501
1838
+ - type: manhattan_pearson
1839
+ value: 68.35916011329631
1840
+ - type: manhattan_spearman
1841
+ value: 67.58212723406085
1842
+ - task:
1843
+ type: STS
1844
+ dataset:
1845
+ type: mteb/stsbenchmark-sts
1846
+ name: MTEB STSBenchmark
1847
+ config: default
1848
+ split: test
1849
+ revision: 8913289635987208e6e7c72789e4be2fe94b6abd
1850
+ metrics:
1851
+ - type: cos_sim_pearson
1852
+ value: 84.03996099800993
1853
+ - type: cos_sim_spearman
1854
+ value: 83.421898505618
1855
+ - type: euclidean_pearson
1856
+ value: 83.78671249317563
1857
+ - type: euclidean_spearman
1858
+ value: 83.4219042133061
1859
+ - type: manhattan_pearson
1860
+ value: 83.44085827249334
1861
+ - type: manhattan_spearman
1862
+ value: 83.02901331535297
1863
+ - task:
1864
+ type: Reranking
1865
+ dataset:
1866
+ type: mteb/scidocs-reranking
1867
+ name: MTEB SciDocsRR
1868
+ config: default
1869
+ split: test
1870
+ revision: 56a6d0140cf6356659e2a7c1413286a774468d44
1871
+ metrics:
1872
+ - type: map
1873
+ value: 88.65396986895777
1874
+ - type: mrr
1875
+ value: 96.60209525405604
1876
+ - task:
1877
+ type: Retrieval
1878
+ dataset:
1879
+ type: scifact
1880
+ name: MTEB SciFact
1881
+ config: default
1882
+ split: test
1883
+ revision: a75ae049398addde9b70f6b268875f5cbce99089
1884
+ metrics:
1885
+ - type: map_at_1
1886
+ value: 51.456
1887
+ - type: map_at_10
1888
+ value: 60.827
1889
+ - type: map_at_100
1890
+ value: 61.595
1891
+ - type: map_at_1000
1892
+ value: 61.629999999999995
1893
+ - type: map_at_3
1894
+ value: 57.518
1895
+ - type: map_at_5
1896
+ value: 59.435
1897
+ - type: ndcg_at_1
1898
+ value: 53.333
1899
+ - type: ndcg_at_10
1900
+ value: 65.57
1901
+ - type: ndcg_at_100
1902
+ value: 68.911
1903
+ - type: ndcg_at_1000
1904
+ value: 69.65299999999999
1905
+ - type: ndcg_at_3
1906
+ value: 60.009
1907
+ - type: ndcg_at_5
1908
+ value: 62.803
1909
+ - type: precision_at_1
1910
+ value: 53.333
1911
+ - type: precision_at_10
1912
+ value: 8.933
1913
+ - type: precision_at_100
1914
+ value: 1.0699999999999998
1915
+ - type: precision_at_1000
1916
+ value: 0.11299999999999999
1917
+ - type: precision_at_3
1918
+ value: 23.333000000000002
1919
+ - type: precision_at_5
1920
+ value: 15.8
1921
+ - type: recall_at_1
1922
+ value: 51.456
1923
+ - type: recall_at_10
1924
+ value: 79.011
1925
+ - type: recall_at_100
1926
+ value: 94.167
1927
+ - type: recall_at_1000
1928
+ value: 99.667
1929
+ - type: recall_at_3
1930
+ value: 64.506
1931
+ - type: recall_at_5
1932
+ value: 71.211
1933
+ - task:
1934
+ type: PairClassification
1935
+ dataset:
1936
+ type: mteb/sprintduplicatequestions-pairclassification
1937
+ name: MTEB SprintDuplicateQuestions
1938
+ config: default
1939
+ split: test
1940
+ revision: 5a8256d0dff9c4bd3be3ba3e67e4e70173f802ea
1941
+ metrics:
1942
+ - type: cos_sim_accuracy
1943
+ value: 99.65940594059406
1944
+ - type: cos_sim_ap
1945
+ value: 90.1455141683116
1946
+ - type: cos_sim_f1
1947
+ value: 82.26044226044226
1948
+ - type: cos_sim_precision
1949
+ value: 80.8695652173913
1950
+ - type: cos_sim_recall
1951
+ value: 83.7
1952
+ - type: dot_accuracy
1953
+ value: 99.65940594059406
1954
+ - type: dot_ap
1955
+ value: 90.1455141683116
1956
+ - type: dot_f1
1957
+ value: 82.26044226044226
1958
+ - type: dot_precision
1959
+ value: 80.8695652173913
1960
+ - type: dot_recall
1961
+ value: 83.7
1962
+ - type: euclidean_accuracy
1963
+ value: 99.65940594059406
1964
+ - type: euclidean_ap
1965
+ value: 90.14551416831162
1966
+ - type: euclidean_f1
1967
+ value: 82.26044226044226
1968
+ - type: euclidean_precision
1969
+ value: 80.8695652173913
1970
+ - type: euclidean_recall
1971
+ value: 83.7
1972
+ - type: manhattan_accuracy
1973
+ value: 99.64950495049504
1974
+ - type: manhattan_ap
1975
+ value: 89.5492617367771
1976
+ - type: manhattan_f1
1977
+ value: 81.58280410356619
1978
+ - type: manhattan_precision
1979
+ value: 79.75167144221585
1980
+ - type: manhattan_recall
1981
+ value: 83.5
1982
+ - type: max_accuracy
1983
+ value: 99.65940594059406
1984
+ - type: max_ap
1985
+ value: 90.14551416831162
1986
+ - type: max_f1
1987
+ value: 82.26044226044226
1988
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering
+ name: MTEB StackExchangeClustering
+ config: default
+ split: test
+ revision: 70a89468f6dccacc6aa2b12a6eac54e74328f235
+ metrics:
+ - type: v_measure
+ value: 53.80048409076929
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering-p2p
+ name: MTEB StackExchangeClusteringP2P
+ config: default
+ split: test
+ revision: d88009ab563dd0b16cfaf4436abaf97fa3550cf0
+ metrics:
+ - type: v_measure
+ value: 34.280269334397545
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/stackoverflowdupquestions-reranking
+ name: MTEB StackOverflowDupQuestions
+ config: default
+ split: test
+ revision: ef807ea29a75ec4f91b50fd4191cb4ee4589a9f9
+ metrics:
+ - type: map
+ value: 51.97907654945493
+ - type: mrr
+ value: 52.82873376623376
+ - task:
+ type: Summarization
+ dataset:
+ type: mteb/summeval
+ name: MTEB SummEval
+ config: default
+ split: test
+ revision: 8753c2788d36c01fc6f05d03fe3f7268d63f9122
+ metrics:
+ - type: cos_sim_pearson
+ value: 28.364293841556304
+ - type: cos_sim_spearman
+ value: 27.485869639926136
+ - type: dot_pearson
+ value: 28.364295910221145
+ - type: dot_spearman
+ value: 27.485869639926136
+ - task:
+ type: Retrieval
+ dataset:
+ type: trec-covid
+ name: MTEB TRECCOVID
+ config: default
+ split: test
+ revision: 2c8041b2c07a79b6f7ba8fe6acc72e5d9f92d217
+ metrics:
+ - type: map_at_1
+ value: 0.19499999999999998
+ - type: map_at_10
+ value: 1.218
+ - type: map_at_100
+ value: 7.061000000000001
+ - type: map_at_1000
+ value: 19.735
+ - type: map_at_3
+ value: 0.46499999999999997
+ - type: map_at_5
+ value: 0.672
+ - type: ndcg_at_1
+ value: 60.0
+ - type: ndcg_at_10
+ value: 51.32600000000001
+ - type: ndcg_at_100
+ value: 41.74
+ - type: ndcg_at_1000
+ value: 43.221
+ - type: ndcg_at_3
+ value: 54.989
+ - type: ndcg_at_5
+ value: 52.905
+ - type: precision_at_1
+ value: 66.0
+ - type: precision_at_10
+ value: 55.60000000000001
+ - type: precision_at_100
+ value: 43.34
+ - type: precision_at_1000
+ value: 19.994
+ - type: precision_at_3
+ value: 59.333000000000006
+ - type: precision_at_5
+ value: 57.199999999999996
+ - type: recall_at_1
+ value: 0.19499999999999998
+ - type: recall_at_10
+ value: 1.473
+ - type: recall_at_100
+ value: 10.596
+ - type: recall_at_1000
+ value: 42.466
+ - type: recall_at_3
+ value: 0.49899999999999994
+ - type: recall_at_5
+ value: 0.76
+ - task:
+ type: Retrieval
+ dataset:
+ type: webis-touche2020
+ name: MTEB Touche2020
+ config: default
+ split: test
+ revision: 527b7d77e16e343303e68cb6af11d6e18b9f7b3b
+ metrics:
+ - type: map_at_1
+ value: 1.997
+ - type: map_at_10
+ value: 7.5569999999999995
+ - type: map_at_100
+ value: 12.238
+ - type: map_at_1000
+ value: 13.773
+ - type: map_at_3
+ value: 4.334
+ - type: map_at_5
+ value: 5.5
+ - type: ndcg_at_1
+ value: 22.448999999999998
+ - type: ndcg_at_10
+ value: 19.933999999999997
+ - type: ndcg_at_100
+ value: 30.525999999999996
+ - type: ndcg_at_1000
+ value: 43.147999999999996
+ - type: ndcg_at_3
+ value: 22.283
+ - type: ndcg_at_5
+ value: 21.224
+ - type: precision_at_1
+ value: 24.490000000000002
+ - type: precision_at_10
+ value: 17.551
+ - type: precision_at_100
+ value: 6.4079999999999995
+ - type: precision_at_1000
+ value: 1.463
+ - type: precision_at_3
+ value: 23.128999999999998
+ - type: precision_at_5
+ value: 20.816000000000003
+ - type: recall_at_1
+ value: 1.997
+ - type: recall_at_10
+ value: 13.001999999999999
+ - type: recall_at_100
+ value: 40.98
+ - type: recall_at_1000
+ value: 79.40899999999999
+ - type: recall_at_3
+ value: 5.380999999999999
+ - type: recall_at_5
+ value: 7.721
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/toxic_conversations_50k
+ name: MTEB ToxicConversationsClassification
+ config: default
+ split: test
+ revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
+ metrics:
+ - type: accuracy
+ value: 60.861200000000004
+ - type: ap
+ value: 11.39641747026629
+ - type: f1
+ value: 47.80230380517065
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/tweet_sentiment_extraction
+ name: MTEB TweetSentimentExtractionClassification
+ config: default
+ split: test
+ revision: 62146448f05be9e52a36b8ee9936447ea787eede
+ metrics:
+ - type: accuracy
+ value: 55.464063384267114
+ - type: f1
+ value: 55.759039643764666
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/twentynewsgroups-clustering
+ name: MTEB TwentyNewsgroupsClustering
+ config: default
+ split: test
+ revision: 091a54f9a36281ce7d6590ec8c75dd485e7e01d4
+ metrics:
+ - type: v_measure
+ value: 49.74455348083809
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twittersemeval2015-pairclassification
+ name: MTEB TwitterSemEval2015
+ config: default
+ split: test
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
+ metrics:
+ - type: cos_sim_accuracy
+ value: 86.07617571675507
+ - type: cos_sim_ap
+ value: 73.85398650568216
+ - type: cos_sim_f1
+ value: 68.50702798531087
+ - type: cos_sim_precision
+ value: 65.86316045775506
+ - type: cos_sim_recall
+ value: 71.37203166226914
+ - type: dot_accuracy
+ value: 86.07617571675507
+ - type: dot_ap
+ value: 73.85398346238429
+ - type: dot_f1
+ value: 68.50702798531087
+ - type: dot_precision
+ value: 65.86316045775506
+ - type: dot_recall
+ value: 71.37203166226914
+ - type: euclidean_accuracy
+ value: 86.07617571675507
+ - type: euclidean_ap
+ value: 73.85398625060357
+ - type: euclidean_f1
+ value: 68.50702798531087
+ - type: euclidean_precision
+ value: 65.86316045775506
+ - type: euclidean_recall
+ value: 71.37203166226914
+ - type: manhattan_accuracy
+ value: 85.98676759849795
+ - type: manhattan_ap
+ value: 73.86874126878737
+ - type: manhattan_f1
+ value: 68.55096559662361
+ - type: manhattan_precision
+ value: 66.51774633904195
+ - type: manhattan_recall
+ value: 70.71240105540898
+ - type: max_accuracy
+ value: 86.07617571675507
+ - type: max_ap
+ value: 73.86874126878737
+ - type: max_f1
+ value: 68.55096559662361
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twitterurlcorpus-pairclassification
+ name: MTEB TwitterURLCorpus
+ config: default
+ split: test
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
+ metrics:
+ - type: cos_sim_accuracy
+ value: 88.51631932316529
+ - type: cos_sim_ap
+ value: 85.10831084479727
+ - type: cos_sim_f1
+ value: 77.14563397129186
+ - type: cos_sim_precision
+ value: 74.9709386806161
+ - type: cos_sim_recall
+ value: 79.45026178010471
+ - type: dot_accuracy
+ value: 88.51631932316529
+ - type: dot_ap
+ value: 85.10831188797107
+ - type: dot_f1
+ value: 77.14563397129186
+ - type: dot_precision
+ value: 74.9709386806161
+ - type: dot_recall
+ value: 79.45026178010471
+ - type: euclidean_accuracy
+ value: 88.51631932316529
+ - type: euclidean_ap
+ value: 85.10829618408616
+ - type: euclidean_f1
+ value: 77.14563397129186
+ - type: euclidean_precision
+ value: 74.9709386806161
+ - type: euclidean_recall
+ value: 79.45026178010471
+ - type: manhattan_accuracy
+ value: 88.50467652423643
+ - type: manhattan_ap
+ value: 85.08329502055064
+ - type: manhattan_f1
+ value: 77.11157455683002
+ - type: manhattan_precision
+ value: 74.67541834968263
+ - type: manhattan_recall
+ value: 79.71204188481676
+ - type: max_accuracy
+ value: 88.51631932316529
+ - type: max_ap
+ value: 85.10831188797107
+ - type: max_f1
+ value: 77.14563397129186
  ---
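The YAML block above is Hugging Face `model-index` metadata, which is what the Hub uses to display these benchmark scores. As a rough sketch (assuming the `huggingface_hub` client; the repository id below is a placeholder, not taken from this card), the parsed evaluation results can be read back programmatically:

```python
# Sketch: read the model-index metrics above via huggingface_hub.
# The repo id is a placeholder; substitute this repository's id.
from huggingface_hub import ModelCard

card = ModelCard.load("sentence-transformers/all-mpnet-base-v2")  # placeholder repo id
for result in card.data.eval_results[:5]:  # EvalResult objects parsed from model-index
    print(result.task_type, result.dataset_name, result.metric_type, result.metric_value)
```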


  ## Evaluation Results

+ For an automated evaluation of this model, see the *MTEB leaderboard* (https://huggingface.co/spaces/mteb/leaderboard) or the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name=sentence-transformers/all-MiniLM-L12-v2)
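As a minimal sketch of how such scores can be reproduced locally (assuming `pip install mteb sentence-transformers`; the repository id below is a placeholder, not taken from this card), a single MTEB task can be run as follows:

```python
# Minimal sketch: run one MTEB task locally and write the scores to JSON.
# The model id is a placeholder; substitute this repository's id.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")  # placeholder repo id

# Evaluate on one of the tasks listed in the metadata above.
evaluation = MTEB(tasks=["SprintDuplicateQuestions"])
evaluation.run(model, output_folder="results")
```

Each task writes its scores as JSON under `results/`; exact numbers may vary slightly with library and dataset revisions.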

  ------