Muennighoff committed
Commit a916fb7
1 Parent(s): cf85fcd

Add MTEB meta

Files changed (1)
  1. README.md +2519 -3
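The diff below adds MTEB benchmark results to the README's YAML front matter using the Hugging Face `model-index` schema: one entry per task/dataset (with its config and split) and a list of metric type/value pairs. As a rough, minimal sketch of how such a block can be assembled programmatically — this is not the script used for this commit, and it assumes PyYAML is installed — the example below builds a single illustrative entry whose values are copied from the diff:

```python
# Minimal sketch: build Hugging Face `model-index` front matter for MTEB results.
# Not the script used for this commit; PyYAML (yaml) is assumed to be installed.
import yaml

MODEL_NAME = "SGPT-5.8B-weightedmean-msmarco-specb-bitfit"

# One illustrative entry (values copied from the diff below); the full metadata
# holds one such entry per (task, dataset, config, split) combination.
results = [
    {
        "task": {"type": "Classification"},
        "dataset": {
            "type": "mteb/amazon_counterfactual",
            "name": "MTEB AmazonCounterfactualClassification (en)",
            "config": "en",
            "split": "test",
        },
        "metrics": [
            {"type": "accuracy", "value": 69.22388059701493},
            {"type": "ap", "value": 32.04724673950256},
            {"type": "f1", "value": 63.25719825770428},
        ],
    }
]

front_matter = {"model-index": [{"name": MODEL_NAME, "results": results}]}

# The block sits between `---` markers at the top of README.md, ahead of the
# human-readable model card body.
readme_header = (
    "---\n"
    + yaml.safe_dump(front_matter, sort_keys=False)
    + "---\n\n# "
    + MODEL_NAME
    + "\n"
)
print(readme_header)
```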
README.md CHANGED
@@ -8,13 +8,2529 @@ model-index:
8
  - name: SGPT-5.8B-weightedmean-msmarco-specb-bitfit
9
  results:
10
  - task:
11
- type: text-classification
12
  dataset:
13
  type: mteb/banking77
14
- name: MTEB Banking77
15
  metrics:
16
  - type: accuracy
17
- value: 0.8449350649350649
18
  ---
19
 
20
  # SGPT-5.8B-weightedmean-msmarco-specb-bitfit
8
  - name: SGPT-5.8B-weightedmean-msmarco-specb-bitfit
9
  results:
10
  - task:
11
+ type: Classification
12
+ dataset:
13
+ type: mteb/amazon_counterfactual
14
+ name: MTEB AmazonCounterfactualClassification (en)
15
+ config: en
16
+ split: test
17
+ metrics:
18
+ - type: accuracy
19
+ value: 69.22388059701493
20
+ - type: ap
21
+ value: 32.04724673950256
22
+ - type: f1
23
+ value: 63.25719825770428
24
+ - task:
25
+ type: Classification
26
+ dataset:
27
+ type: mteb/amazon_polarity
28
+ name: MTEB AmazonPolarityClassification
29
+ config: default
30
+ split: test
31
+ metrics:
32
+ - type: accuracy
33
+ value: 71.26109999999998
34
+ - type: ap
35
+ value: 66.16336378255403
36
+ - type: f1
37
+ value: 70.89719145825303
38
+ - task:
39
+ type: Classification
40
+ dataset:
41
+ type: mteb/amazon_reviews_multi
42
+ name: MTEB AmazonReviewsClassification (en)
43
+ config: en
44
+ split: test
45
+ metrics:
46
+ - type: accuracy
47
+ value: 39.19199999999999
48
+ - type: f1
49
+ value: 38.580766731113826
50
+ - task:
51
+ type: Retrieval
52
+ dataset:
53
+ type: arguana
54
+ name: MTEB ArguAna
55
+ config: default
56
+ split: test
57
+ metrics:
58
+ - type: map_at_1
59
+ value: 27.311999999999998
60
+ - type: map_at_10
61
+ value: 42.620000000000005
62
+ - type: map_at_100
63
+ value: 43.707
64
+ - type: map_at_1000
65
+ value: 43.714999999999996
66
+ - type: map_at_3
67
+ value: 37.624
68
+ - type: map_at_5
69
+ value: 40.498
70
+ - type: mrr_at_1
71
+ value: 27.667
72
+ - type: mrr_at_10
73
+ value: 42.737
74
+ - type: mrr_at_100
75
+ value: 43.823
76
+ - type: mrr_at_1000
77
+ value: 43.830999999999996
78
+ - type: mrr_at_3
79
+ value: 37.743
80
+ - type: mrr_at_5
81
+ value: 40.616
82
+ - type: ndcg_at_1
83
+ value: 27.311999999999998
84
+ - type: ndcg_at_10
85
+ value: 51.37500000000001
86
+ - type: ndcg_at_100
87
+ value: 55.778000000000006
88
+ - type: ndcg_at_1000
89
+ value: 55.96600000000001
90
+ - type: ndcg_at_3
91
+ value: 41.087
92
+ - type: ndcg_at_5
93
+ value: 46.269
94
+ - type: precision_at_1
95
+ value: 27.311999999999998
96
+ - type: precision_at_10
97
+ value: 7.945
98
+ - type: precision_at_100
99
+ value: 0.9820000000000001
100
+ - type: precision_at_1000
101
+ value: 0.1
102
+ - type: precision_at_3
103
+ value: 17.046
104
+ - type: precision_at_5
105
+ value: 12.745000000000001
106
+ - type: recall_at_1
107
+ value: 27.311999999999998
108
+ - type: recall_at_10
109
+ value: 79.445
110
+ - type: recall_at_100
111
+ value: 98.151
112
+ - type: recall_at_1000
113
+ value: 99.57300000000001
114
+ - type: recall_at_3
115
+ value: 51.13799999999999
116
+ - type: recall_at_5
117
+ value: 63.727000000000004
118
+ - task:
119
+ type: Clustering
120
+ dataset:
121
+ type: mteb/arxiv-clustering-p2p
122
+ name: MTEB ArxivClusteringP2P
123
+ config: default
124
+ split: test
125
+ metrics:
126
+ - type: v_measure
127
+ value: 45.59037428592033
128
+ - task:
129
+ type: Clustering
130
+ dataset:
131
+ type: mteb/arxiv-clustering-s2s
132
+ name: MTEB ArxivClusteringS2S
133
+ config: default
134
+ split: test
135
+ metrics:
136
+ - type: v_measure
137
+ value: 38.86371701986363
138
+ - task:
139
+ type: Reranking
140
+ dataset:
141
+ type: mteb/askubuntudupquestions-reranking
142
+ name: MTEB AskUbuntuDupQuestions
143
+ config: default
144
+ split: test
145
+ metrics:
146
+ - type: map
147
+ value: 61.625568691427766
148
+ - type: mrr
149
+ value: 75.83256386580486
150
+ - task:
151
+ type: STS
152
+ dataset:
153
+ type: mteb/biosses-sts
154
+ name: MTEB BIOSSES
155
+ config: default
156
+ split: test
157
+ metrics:
158
+ - type: cos_sim_pearson
159
+ value: 89.96074355094802
160
+ - type: cos_sim_spearman
161
+ value: 86.2501580394454
162
+ - type: euclidean_pearson
163
+ value: 82.18427440380462
164
+ - type: euclidean_spearman
165
+ value: 80.14760935017947
166
+ - type: manhattan_pearson
167
+ value: 82.24621578156392
168
+ - type: manhattan_spearman
169
+ value: 80.00363016590163
170
+ - task:
171
+ type: Classification
172
  dataset:
173
  type: mteb/banking77
174
+ name: MTEB Banking77Classification
175
+ config: default
176
+ split: test
177
+ metrics:
178
+ - type: accuracy
179
+ value: 84.49350649350649
180
+ - type: f1
181
+ value: 84.4249343233736
182
+ - task:
183
+ type: Clustering
184
+ dataset:
185
+ type: mteb/biorxiv-clustering-p2p
186
+ name: MTEB BiorxivClusteringP2P
187
+ config: default
188
+ split: test
189
+ metrics:
190
+ - type: v_measure
191
+ value: 36.551459722989385
192
+ - task:
193
+ type: Clustering
194
+ dataset:
195
+ type: mteb/biorxiv-clustering-s2s
196
+ name: MTEB BiorxivClusteringS2S
197
+ config: default
198
+ split: test
199
+ metrics:
200
+ - type: v_measure
201
+ value: 33.69901851846774
202
+ - task:
203
+ type: Retrieval
204
+ dataset:
205
+ type: BeIR/cqadupstack
206
+ name: MTEB CQADupstackAndroidRetrieval
207
+ config: default
208
+ split: test
209
+ metrics:
210
+ - type: map_at_1
211
+ value: 30.499
212
+ - type: map_at_10
213
+ value: 41.208
214
+ - type: map_at_100
215
+ value: 42.638
216
+ - type: map_at_1000
217
+ value: 42.754
218
+ - type: map_at_3
219
+ value: 37.506
220
+ - type: map_at_5
221
+ value: 39.422000000000004
222
+ - type: mrr_at_1
223
+ value: 37.339
224
+ - type: mrr_at_10
225
+ value: 47.051
226
+ - type: mrr_at_100
227
+ value: 47.745
228
+ - type: mrr_at_1000
229
+ value: 47.786
230
+ - type: mrr_at_3
231
+ value: 44.086999999999996
232
+ - type: mrr_at_5
233
+ value: 45.711
234
+ - type: ndcg_at_1
235
+ value: 37.339
236
+ - type: ndcg_at_10
237
+ value: 47.666
238
+ - type: ndcg_at_100
239
+ value: 52.994
240
+ - type: ndcg_at_1000
241
+ value: 54.928999999999995
242
+ - type: ndcg_at_3
243
+ value: 41.982
244
+ - type: ndcg_at_5
245
+ value: 44.42
246
+ - type: precision_at_1
247
+ value: 37.339
248
+ - type: precision_at_10
249
+ value: 9.127
250
+ - type: precision_at_100
251
+ value: 1.4749999999999999
252
+ - type: precision_at_1000
253
+ value: 0.194
254
+ - type: precision_at_3
255
+ value: 20.076
256
+ - type: precision_at_5
257
+ value: 14.449000000000002
258
+ - type: recall_at_1
259
+ value: 30.499
260
+ - type: recall_at_10
261
+ value: 60.328
262
+ - type: recall_at_100
263
+ value: 82.57900000000001
264
+ - type: recall_at_1000
265
+ value: 95.074
266
+ - type: recall_at_3
267
+ value: 44.17
268
+ - type: recall_at_5
269
+ value: 50.94
270
+ - task:
271
+ type: Retrieval
272
+ dataset:
273
+ type: BeIR/cqadupstack
274
+ name: MTEB CQADupstackEnglishRetrieval
275
+ config: default
276
+ split: test
277
+ metrics:
278
+ - type: map_at_1
279
+ value: 30.613
280
+ - type: map_at_10
281
+ value: 40.781
282
+ - type: map_at_100
283
+ value: 42.018
284
+ - type: map_at_1000
285
+ value: 42.132999999999996
286
+ - type: map_at_3
287
+ value: 37.816
288
+ - type: map_at_5
289
+ value: 39.389
290
+ - type: mrr_at_1
291
+ value: 38.408
292
+ - type: mrr_at_10
293
+ value: 46.631
294
+ - type: mrr_at_100
295
+ value: 47.332
296
+ - type: mrr_at_1000
297
+ value: 47.368
298
+ - type: mrr_at_3
299
+ value: 44.384
300
+ - type: mrr_at_5
301
+ value: 45.661
302
+ - type: ndcg_at_1
303
+ value: 38.408
304
+ - type: ndcg_at_10
305
+ value: 46.379999999999995
306
+ - type: ndcg_at_100
307
+ value: 50.81
308
+ - type: ndcg_at_1000
309
+ value: 52.663000000000004
310
+ - type: ndcg_at_3
311
+ value: 42.18
312
+ - type: ndcg_at_5
313
+ value: 43.974000000000004
314
+ - type: precision_at_1
315
+ value: 38.408
316
+ - type: precision_at_10
317
+ value: 8.656
318
+ - type: precision_at_100
319
+ value: 1.3860000000000001
320
+ - type: precision_at_1000
321
+ value: 0.184
322
+ - type: precision_at_3
323
+ value: 20.276
324
+ - type: precision_at_5
325
+ value: 14.241999999999999
326
+ - type: recall_at_1
327
+ value: 30.613
328
+ - type: recall_at_10
329
+ value: 56.44
330
+ - type: recall_at_100
331
+ value: 75.044
332
+ - type: recall_at_1000
333
+ value: 86.426
334
+ - type: recall_at_3
335
+ value: 43.766
336
+ - type: recall_at_5
337
+ value: 48.998000000000005
338
+ - task:
339
+ type: Retrieval
340
+ dataset:
341
+ type: BeIR/cqadupstack
342
+ name: MTEB CQADupstackGamingRetrieval
343
+ config: default
344
+ split: test
345
+ metrics:
346
+ - type: map_at_1
347
+ value: 37.370999999999995
348
+ - type: map_at_10
349
+ value: 49.718
350
+ - type: map_at_100
351
+ value: 50.737
352
+ - type: map_at_1000
353
+ value: 50.79
354
+ - type: map_at_3
355
+ value: 46.231
356
+ - type: map_at_5
357
+ value: 48.329
358
+ - type: mrr_at_1
359
+ value: 42.884
360
+ - type: mrr_at_10
361
+ value: 53.176
362
+ - type: mrr_at_100
363
+ value: 53.81700000000001
364
+ - type: mrr_at_1000
365
+ value: 53.845
366
+ - type: mrr_at_3
367
+ value: 50.199000000000005
368
+ - type: mrr_at_5
369
+ value: 52.129999999999995
370
+ - type: ndcg_at_1
371
+ value: 42.884
372
+ - type: ndcg_at_10
373
+ value: 55.826
374
+ - type: ndcg_at_100
375
+ value: 59.93000000000001
376
+ - type: ndcg_at_1000
377
+ value: 61.013
378
+ - type: ndcg_at_3
379
+ value: 49.764
380
+ - type: ndcg_at_5
381
+ value: 53.025999999999996
382
+ - type: precision_at_1
383
+ value: 42.884
384
+ - type: precision_at_10
385
+ value: 9.046999999999999
386
+ - type: precision_at_100
387
+ value: 1.212
388
+ - type: precision_at_1000
389
+ value: 0.135
390
+ - type: precision_at_3
391
+ value: 22.131999999999998
392
+ - type: precision_at_5
393
+ value: 15.524
394
+ - type: recall_at_1
395
+ value: 37.370999999999995
396
+ - type: recall_at_10
397
+ value: 70.482
398
+ - type: recall_at_100
399
+ value: 88.425
400
+ - type: recall_at_1000
401
+ value: 96.03399999999999
402
+ - type: recall_at_3
403
+ value: 54.43
404
+ - type: recall_at_5
405
+ value: 62.327999999999996
406
+ - task:
407
+ type: Retrieval
408
+ dataset:
409
+ type: BeIR/cqadupstack
410
+ name: MTEB CQADupstackGisRetrieval
411
+ config: default
412
+ split: test
413
+ metrics:
414
+ - type: map_at_1
415
+ value: 22.875999999999998
416
+ - type: map_at_10
417
+ value: 31.715
418
+ - type: map_at_100
419
+ value: 32.847
420
+ - type: map_at_1000
421
+ value: 32.922000000000004
422
+ - type: map_at_3
423
+ value: 29.049999999999997
424
+ - type: map_at_5
425
+ value: 30.396
426
+ - type: mrr_at_1
427
+ value: 24.52
428
+ - type: mrr_at_10
429
+ value: 33.497
430
+ - type: mrr_at_100
431
+ value: 34.455000000000005
432
+ - type: mrr_at_1000
433
+ value: 34.510000000000005
434
+ - type: mrr_at_3
435
+ value: 30.791
436
+ - type: mrr_at_5
437
+ value: 32.175
438
+ - type: ndcg_at_1
439
+ value: 24.52
440
+ - type: ndcg_at_10
441
+ value: 36.95
442
+ - type: ndcg_at_100
443
+ value: 42.238
444
+ - type: ndcg_at_1000
445
+ value: 44.147999999999996
446
+ - type: ndcg_at_3
447
+ value: 31.435000000000002
448
+ - type: ndcg_at_5
449
+ value: 33.839000000000006
450
+ - type: precision_at_1
451
+ value: 24.52
452
+ - type: precision_at_10
453
+ value: 5.9319999999999995
454
+ - type: precision_at_100
455
+ value: 0.901
456
+ - type: precision_at_1000
457
+ value: 0.11
458
+ - type: precision_at_3
459
+ value: 13.446
460
+ - type: precision_at_5
461
+ value: 9.469
462
+ - type: recall_at_1
463
+ value: 22.875999999999998
464
+ - type: recall_at_10
465
+ value: 51.38
466
+ - type: recall_at_100
467
+ value: 75.31099999999999
468
+ - type: recall_at_1000
469
+ value: 89.718
470
+ - type: recall_at_3
471
+ value: 36.26
472
+ - type: recall_at_5
473
+ value: 42.248999999999995
474
+ - task:
475
+ type: Retrieval
476
+ dataset:
477
+ type: BeIR/cqadupstack
478
+ name: MTEB CQADupstackMathematicaRetrieval
479
+ config: default
480
+ split: test
481
+ metrics:
482
+ - type: map_at_1
483
+ value: 14.984
484
+ - type: map_at_10
485
+ value: 23.457
486
+ - type: map_at_100
487
+ value: 24.723
488
+ - type: map_at_1000
489
+ value: 24.846
490
+ - type: map_at_3
491
+ value: 20.873
492
+ - type: map_at_5
493
+ value: 22.357
494
+ - type: mrr_at_1
495
+ value: 18.159
496
+ - type: mrr_at_10
497
+ value: 27.431
498
+ - type: mrr_at_100
499
+ value: 28.449
500
+ - type: mrr_at_1000
501
+ value: 28.52
502
+ - type: mrr_at_3
503
+ value: 24.979000000000003
504
+ - type: mrr_at_5
505
+ value: 26.447
506
+ - type: ndcg_at_1
507
+ value: 18.159
508
+ - type: ndcg_at_10
509
+ value: 28.627999999999997
510
+ - type: ndcg_at_100
511
+ value: 34.741
512
+ - type: ndcg_at_1000
513
+ value: 37.516
514
+ - type: ndcg_at_3
515
+ value: 23.902
516
+ - type: ndcg_at_5
517
+ value: 26.294
518
+ - type: precision_at_1
519
+ value: 18.159
520
+ - type: precision_at_10
521
+ value: 5.485
522
+ - type: precision_at_100
523
+ value: 0.985
524
+ - type: precision_at_1000
525
+ value: 0.136
526
+ - type: precision_at_3
527
+ value: 11.774
528
+ - type: precision_at_5
529
+ value: 8.731
530
+ - type: recall_at_1
531
+ value: 14.984
532
+ - type: recall_at_10
533
+ value: 40.198
534
+ - type: recall_at_100
535
+ value: 67.11500000000001
536
+ - type: recall_at_1000
537
+ value: 86.497
538
+ - type: recall_at_3
539
+ value: 27.639000000000003
540
+ - type: recall_at_5
541
+ value: 33.595000000000006
542
+ - task:
543
+ type: Retrieval
544
+ dataset:
545
+ type: BeIR/cqadupstack
546
+ name: MTEB CQADupstackPhysicsRetrieval
547
+ config: default
548
+ split: test
549
+ metrics:
550
+ - type: map_at_1
551
+ value: 29.067
552
+ - type: map_at_10
553
+ value: 39.457
554
+ - type: map_at_100
555
+ value: 40.83
556
+ - type: map_at_1000
557
+ value: 40.94
558
+ - type: map_at_3
559
+ value: 35.995
560
+ - type: map_at_5
561
+ value: 38.159
562
+ - type: mrr_at_1
563
+ value: 34.937000000000005
564
+ - type: mrr_at_10
565
+ value: 44.755
566
+ - type: mrr_at_100
567
+ value: 45.549
568
+ - type: mrr_at_1000
569
+ value: 45.589
570
+ - type: mrr_at_3
571
+ value: 41.947
572
+ - type: mrr_at_5
573
+ value: 43.733
574
+ - type: ndcg_at_1
575
+ value: 34.937000000000005
576
+ - type: ndcg_at_10
577
+ value: 45.573
578
+ - type: ndcg_at_100
579
+ value: 51.266999999999996
580
+ - type: ndcg_at_1000
581
+ value: 53.184
582
+ - type: ndcg_at_3
583
+ value: 39.961999999999996
584
+ - type: ndcg_at_5
585
+ value: 43.02
586
+ - type: precision_at_1
587
+ value: 34.937000000000005
588
+ - type: precision_at_10
589
+ value: 8.296000000000001
590
+ - type: precision_at_100
591
+ value: 1.32
592
+ - type: precision_at_1000
593
+ value: 0.167
594
+ - type: precision_at_3
595
+ value: 18.8
596
+ - type: precision_at_5
597
+ value: 13.763
598
+ - type: recall_at_1
599
+ value: 29.067
600
+ - type: recall_at_10
601
+ value: 58.298
602
+ - type: recall_at_100
603
+ value: 82.25099999999999
604
+ - type: recall_at_1000
605
+ value: 94.476
606
+ - type: recall_at_3
607
+ value: 42.984
608
+ - type: recall_at_5
609
+ value: 50.658
610
+ - task:
611
+ type: Retrieval
612
+ dataset:
613
+ type: BeIR/cqadupstack
614
+ name: MTEB CQADupstackProgrammersRetrieval
615
+ config: default
616
+ split: test
617
+ metrics:
618
+ - type: map_at_1
619
+ value: 25.985999999999997
620
+ - type: map_at_10
621
+ value: 35.746
622
+ - type: map_at_100
623
+ value: 37.067
624
+ - type: map_at_1000
625
+ value: 37.191
626
+ - type: map_at_3
627
+ value: 32.599000000000004
628
+ - type: map_at_5
629
+ value: 34.239000000000004
630
+ - type: mrr_at_1
631
+ value: 31.735000000000003
632
+ - type: mrr_at_10
633
+ value: 40.515
634
+ - type: mrr_at_100
635
+ value: 41.459
636
+ - type: mrr_at_1000
637
+ value: 41.516
638
+ - type: mrr_at_3
639
+ value: 37.938
640
+ - type: mrr_at_5
641
+ value: 39.25
642
+ - type: ndcg_at_1
643
+ value: 31.735000000000003
644
+ - type: ndcg_at_10
645
+ value: 41.484
646
+ - type: ndcg_at_100
647
+ value: 47.047
648
+ - type: ndcg_at_1000
649
+ value: 49.427
650
+ - type: ndcg_at_3
651
+ value: 36.254999999999995
652
+ - type: ndcg_at_5
653
+ value: 38.375
654
+ - type: precision_at_1
655
+ value: 31.735000000000003
656
+ - type: precision_at_10
657
+ value: 7.66
658
+ - type: precision_at_100
659
+ value: 1.234
660
+ - type: precision_at_1000
661
+ value: 0.16
662
+ - type: precision_at_3
663
+ value: 17.427999999999997
664
+ - type: precision_at_5
665
+ value: 12.328999999999999
666
+ - type: recall_at_1
667
+ value: 25.985999999999997
668
+ - type: recall_at_10
669
+ value: 53.761
670
+ - type: recall_at_100
671
+ value: 77.149
672
+ - type: recall_at_1000
673
+ value: 93.342
674
+ - type: recall_at_3
675
+ value: 39.068000000000005
676
+ - type: recall_at_5
677
+ value: 44.693
678
+ - task:
679
+ type: Retrieval
680
+ dataset:
681
+ type: BeIR/cqadupstack
682
+ name: MTEB CQADupstackRetrieval
683
+ config: default
684
+ split: test
685
+ metrics:
686
+ - type: map_at_1
687
+ value: 24.949749999999998
688
+ - type: map_at_10
689
+ value: 34.04991666666667
690
+ - type: map_at_100
691
+ value: 35.26825
692
+ - type: map_at_1000
693
+ value: 35.38316666666667
694
+ - type: map_at_3
695
+ value: 31.181333333333335
696
+ - type: map_at_5
697
+ value: 32.77391666666667
698
+ - type: mrr_at_1
699
+ value: 29.402833333333334
700
+ - type: mrr_at_10
701
+ value: 38.01633333333333
702
+ - type: mrr_at_100
703
+ value: 38.88033333333334
704
+ - type: mrr_at_1000
705
+ value: 38.938500000000005
706
+ - type: mrr_at_3
707
+ value: 35.5175
708
+ - type: mrr_at_5
709
+ value: 36.93808333333333
710
+ - type: ndcg_at_1
711
+ value: 29.402833333333334
712
+ - type: ndcg_at_10
713
+ value: 39.403166666666664
714
+ - type: ndcg_at_100
715
+ value: 44.66408333333333
716
+ - type: ndcg_at_1000
717
+ value: 46.96283333333333
718
+ - type: ndcg_at_3
719
+ value: 34.46633333333334
720
+ - type: ndcg_at_5
721
+ value: 36.78441666666667
722
+ - type: precision_at_1
723
+ value: 29.402833333333334
724
+ - type: precision_at_10
725
+ value: 6.965833333333333
726
+ - type: precision_at_100
727
+ value: 1.1330833333333334
728
+ - type: precision_at_1000
729
+ value: 0.15158333333333335
730
+ - type: precision_at_3
731
+ value: 15.886666666666665
732
+ - type: precision_at_5
733
+ value: 11.360416666666667
734
+ - type: recall_at_1
735
+ value: 24.949749999999998
736
+ - type: recall_at_10
737
+ value: 51.29325
738
+ - type: recall_at_100
739
+ value: 74.3695
740
+ - type: recall_at_1000
741
+ value: 90.31299999999999
742
+ - type: recall_at_3
743
+ value: 37.580083333333334
744
+ - type: recall_at_5
745
+ value: 43.529666666666664
746
+ - task:
747
+ type: Retrieval
748
+ dataset:
749
+ type: BeIR/cqadupstack
750
+ name: MTEB CQADupstackStatsRetrieval
751
+ config: default
752
+ split: test
753
+ metrics:
754
+ - type: map_at_1
755
+ value: 22.081999999999997
756
+ - type: map_at_10
757
+ value: 29.215999999999998
758
+ - type: map_at_100
759
+ value: 30.163
760
+ - type: map_at_1000
761
+ value: 30.269000000000002
762
+ - type: map_at_3
763
+ value: 26.942
764
+ - type: map_at_5
765
+ value: 28.236
766
+ - type: mrr_at_1
767
+ value: 24.847
768
+ - type: mrr_at_10
769
+ value: 31.918999999999997
770
+ - type: mrr_at_100
771
+ value: 32.817
772
+ - type: mrr_at_1000
773
+ value: 32.897
774
+ - type: mrr_at_3
775
+ value: 29.831000000000003
776
+ - type: mrr_at_5
777
+ value: 31.019999999999996
778
+ - type: ndcg_at_1
779
+ value: 24.847
780
+ - type: ndcg_at_10
781
+ value: 33.4
782
+ - type: ndcg_at_100
783
+ value: 38.354
784
+ - type: ndcg_at_1000
785
+ value: 41.045
786
+ - type: ndcg_at_3
787
+ value: 29.236
788
+ - type: ndcg_at_5
789
+ value: 31.258000000000003
790
+ - type: precision_at_1
791
+ value: 24.847
792
+ - type: precision_at_10
793
+ value: 5.353
794
+ - type: precision_at_100
795
+ value: 0.853
796
+ - type: precision_at_1000
797
+ value: 0.116
798
+ - type: precision_at_3
799
+ value: 12.679000000000002
800
+ - type: precision_at_5
801
+ value: 8.988
802
+ - type: recall_at_1
803
+ value: 22.081999999999997
804
+ - type: recall_at_10
805
+ value: 43.505
806
+ - type: recall_at_100
807
+ value: 66.45400000000001
808
+ - type: recall_at_1000
809
+ value: 86.378
810
+ - type: recall_at_3
811
+ value: 32.163000000000004
812
+ - type: recall_at_5
813
+ value: 37.059999999999995
814
+ - task:
815
+ type: Retrieval
816
+ dataset:
817
+ type: BeIR/cqadupstack
818
+ name: MTEB CQADupstackTexRetrieval
819
+ config: default
820
+ split: test
821
+ metrics:
822
+ - type: map_at_1
823
+ value: 15.540000000000001
824
+ - type: map_at_10
825
+ value: 22.362000000000002
826
+ - type: map_at_100
827
+ value: 23.435
828
+ - type: map_at_1000
829
+ value: 23.564
830
+ - type: map_at_3
831
+ value: 20.143
832
+ - type: map_at_5
833
+ value: 21.324
834
+ - type: mrr_at_1
835
+ value: 18.892
836
+ - type: mrr_at_10
837
+ value: 25.942999999999998
838
+ - type: mrr_at_100
839
+ value: 26.883000000000003
840
+ - type: mrr_at_1000
841
+ value: 26.968999999999998
842
+ - type: mrr_at_3
843
+ value: 23.727
844
+ - type: mrr_at_5
845
+ value: 24.923000000000002
846
+ - type: ndcg_at_1
847
+ value: 18.892
848
+ - type: ndcg_at_10
849
+ value: 26.811
850
+ - type: ndcg_at_100
851
+ value: 32.066
852
+ - type: ndcg_at_1000
853
+ value: 35.166
854
+ - type: ndcg_at_3
855
+ value: 22.706
856
+ - type: ndcg_at_5
857
+ value: 24.508
858
+ - type: precision_at_1
859
+ value: 18.892
860
+ - type: precision_at_10
861
+ value: 4.942
862
+ - type: precision_at_100
863
+ value: 0.878
864
+ - type: precision_at_1000
865
+ value: 0.131
866
+ - type: precision_at_3
867
+ value: 10.748000000000001
868
+ - type: precision_at_5
869
+ value: 7.784000000000001
870
+ - type: recall_at_1
871
+ value: 15.540000000000001
872
+ - type: recall_at_10
873
+ value: 36.742999999999995
874
+ - type: recall_at_100
875
+ value: 60.525
876
+ - type: recall_at_1000
877
+ value: 82.57600000000001
878
+ - type: recall_at_3
879
+ value: 25.252000000000002
880
+ - type: recall_at_5
881
+ value: 29.872
882
+ - task:
883
+ type: Retrieval
884
+ dataset:
885
+ type: BeIR/cqadupstack
886
+ name: MTEB CQADupstackUnixRetrieval
887
+ config: default
888
+ split: test
889
+ metrics:
890
+ - type: map_at_1
891
+ value: 24.453
892
+ - type: map_at_10
893
+ value: 33.363
894
+ - type: map_at_100
895
+ value: 34.579
896
+ - type: map_at_1000
897
+ value: 34.686
898
+ - type: map_at_3
899
+ value: 30.583
900
+ - type: map_at_5
901
+ value: 32.118
902
+ - type: mrr_at_1
903
+ value: 28.918
904
+ - type: mrr_at_10
905
+ value: 37.675
906
+ - type: mrr_at_100
907
+ value: 38.567
908
+ - type: mrr_at_1000
909
+ value: 38.632
910
+ - type: mrr_at_3
911
+ value: 35.260999999999996
912
+ - type: mrr_at_5
913
+ value: 36.576
914
+ - type: ndcg_at_1
915
+ value: 28.918
916
+ - type: ndcg_at_10
917
+ value: 38.736
918
+ - type: ndcg_at_100
919
+ value: 44.261
920
+ - type: ndcg_at_1000
921
+ value: 46.72
922
+ - type: ndcg_at_3
923
+ value: 33.81
924
+ - type: ndcg_at_5
925
+ value: 36.009
926
+ - type: precision_at_1
927
+ value: 28.918
928
+ - type: precision_at_10
929
+ value: 6.586
930
+ - type: precision_at_100
931
+ value: 1.047
932
+ - type: precision_at_1000
933
+ value: 0.13699999999999998
934
+ - type: precision_at_3
935
+ value: 15.360999999999999
936
+ - type: precision_at_5
937
+ value: 10.857999999999999
938
+ - type: recall_at_1
939
+ value: 24.453
940
+ - type: recall_at_10
941
+ value: 50.885999999999996
942
+ - type: recall_at_100
943
+ value: 75.03
944
+ - type: recall_at_1000
945
+ value: 92.123
946
+ - type: recall_at_3
947
+ value: 37.138
948
+ - type: recall_at_5
949
+ value: 42.864999999999995
950
+ - task:
951
+ type: Retrieval
952
+ dataset:
953
+ type: BeIR/cqadupstack
954
+ name: MTEB CQADupstackWebmastersRetrieval
955
+ config: default
956
+ split: test
957
+ metrics:
958
+ - type: map_at_1
959
+ value: 24.57
960
+ - type: map_at_10
961
+ value: 33.672000000000004
962
+ - type: map_at_100
963
+ value: 35.244
964
+ - type: map_at_1000
965
+ value: 35.467
966
+ - type: map_at_3
967
+ value: 30.712
968
+ - type: map_at_5
969
+ value: 32.383
970
+ - type: mrr_at_1
971
+ value: 29.644
972
+ - type: mrr_at_10
973
+ value: 38.344
974
+ - type: mrr_at_100
975
+ value: 39.219
976
+ - type: mrr_at_1000
977
+ value: 39.282000000000004
978
+ - type: mrr_at_3
979
+ value: 35.771
980
+ - type: mrr_at_5
981
+ value: 37.273
982
+ - type: ndcg_at_1
983
+ value: 29.644
984
+ - type: ndcg_at_10
985
+ value: 39.567
986
+ - type: ndcg_at_100
987
+ value: 45.097
988
+ - type: ndcg_at_1000
989
+ value: 47.923
990
+ - type: ndcg_at_3
991
+ value: 34.768
992
+ - type: ndcg_at_5
993
+ value: 37.122
994
+ - type: precision_at_1
995
+ value: 29.644
996
+ - type: precision_at_10
997
+ value: 7.5889999999999995
998
+ - type: precision_at_100
999
+ value: 1.478
1000
+ - type: precision_at_1000
1001
+ value: 0.23500000000000001
1002
+ - type: precision_at_3
1003
+ value: 16.337
1004
+ - type: precision_at_5
1005
+ value: 12.055
1006
+ - type: recall_at_1
1007
+ value: 24.57
1008
+ - type: recall_at_10
1009
+ value: 51.00900000000001
1010
+ - type: recall_at_100
1011
+ value: 75.423
1012
+ - type: recall_at_1000
1013
+ value: 93.671
1014
+ - type: recall_at_3
1015
+ value: 36.925999999999995
1016
+ - type: recall_at_5
1017
+ value: 43.245
1018
+ - task:
1019
+ type: Retrieval
1020
+ dataset:
1021
+ type: BeIR/cqadupstack
1022
+ name: MTEB CQADupstackWordpressRetrieval
1023
+ config: default
1024
+ split: test
1025
+ metrics:
1026
+ - type: map_at_1
1027
+ value: 21.356
1028
+ - type: map_at_10
1029
+ value: 27.904
1030
+ - type: map_at_100
1031
+ value: 28.938000000000002
1032
+ - type: map_at_1000
1033
+ value: 29.036
1034
+ - type: map_at_3
1035
+ value: 25.726
1036
+ - type: map_at_5
1037
+ value: 26.935
1038
+ - type: mrr_at_1
1039
+ value: 22.551
1040
+ - type: mrr_at_10
1041
+ value: 29.259
1042
+ - type: mrr_at_100
1043
+ value: 30.272
1044
+ - type: mrr_at_1000
1045
+ value: 30.348000000000003
1046
+ - type: mrr_at_3
1047
+ value: 27.295
1048
+ - type: mrr_at_5
1049
+ value: 28.358
1050
+ - type: ndcg_at_1
1051
+ value: 22.551
1052
+ - type: ndcg_at_10
1053
+ value: 31.817
1054
+ - type: ndcg_at_100
1055
+ value: 37.164
1056
+ - type: ndcg_at_1000
1057
+ value: 39.82
1058
+ - type: ndcg_at_3
1059
+ value: 27.595999999999997
1060
+ - type: ndcg_at_5
1061
+ value: 29.568
1062
+ - type: precision_at_1
1063
+ value: 22.551
1064
+ - type: precision_at_10
1065
+ value: 4.917
1066
+ - type: precision_at_100
1067
+ value: 0.828
1068
+ - type: precision_at_1000
1069
+ value: 0.11399999999999999
1070
+ - type: precision_at_3
1071
+ value: 11.583
1072
+ - type: precision_at_5
1073
+ value: 8.133
1074
+ - type: recall_at_1
1075
+ value: 21.356
1076
+ - type: recall_at_10
1077
+ value: 42.489
1078
+ - type: recall_at_100
1079
+ value: 67.128
1080
+ - type: recall_at_1000
1081
+ value: 87.441
1082
+ - type: recall_at_3
1083
+ value: 31.165
1084
+ - type: recall_at_5
1085
+ value: 35.853
1086
+ - task:
1087
+ type: Retrieval
1088
+ dataset:
1089
+ type: climate-fever
1090
+ name: MTEB ClimateFEVER
1091
+ config: default
1092
+ split: test
1093
+ metrics:
1094
+ - type: map_at_1
1095
+ value: 12.306000000000001
1096
+ - type: map_at_10
1097
+ value: 21.523
1098
+ - type: map_at_100
1099
+ value: 23.358
1100
+ - type: map_at_1000
1101
+ value: 23.541
1102
+ - type: map_at_3
1103
+ value: 17.809
1104
+ - type: map_at_5
1105
+ value: 19.631
1106
+ - type: mrr_at_1
1107
+ value: 27.948
1108
+ - type: mrr_at_10
1109
+ value: 40.355000000000004
1110
+ - type: mrr_at_100
1111
+ value: 41.166000000000004
1112
+ - type: mrr_at_1000
1113
+ value: 41.203
1114
+ - type: mrr_at_3
1115
+ value: 36.819
1116
+ - type: mrr_at_5
1117
+ value: 38.958999999999996
1118
+ - type: ndcg_at_1
1119
+ value: 27.948
1120
+ - type: ndcg_at_10
1121
+ value: 30.462
1122
+ - type: ndcg_at_100
1123
+ value: 37.473
1124
+ - type: ndcg_at_1000
1125
+ value: 40.717999999999996
1126
+ - type: ndcg_at_3
1127
+ value: 24.646
1128
+ - type: ndcg_at_5
1129
+ value: 26.642
1130
+ - type: precision_at_1
1131
+ value: 27.948
1132
+ - type: precision_at_10
1133
+ value: 9.648
1134
+ - type: precision_at_100
1135
+ value: 1.7239999999999998
1136
+ - type: precision_at_1000
1137
+ value: 0.232
1138
+ - type: precision_at_3
1139
+ value: 18.48
1140
+ - type: precision_at_5
1141
+ value: 14.293
1142
+ - type: recall_at_1
1143
+ value: 12.306000000000001
1144
+ - type: recall_at_10
1145
+ value: 37.181
1146
+ - type: recall_at_100
1147
+ value: 61.148
1148
+ - type: recall_at_1000
1149
+ value: 79.401
1150
+ - type: recall_at_3
1151
+ value: 22.883
1152
+ - type: recall_at_5
1153
+ value: 28.59
1154
+ - task:
1155
+ type: Retrieval
1156
+ dataset:
1157
+ type: dbpedia-entity
1158
+ name: MTEB DBPedia
1159
+ config: default
1160
+ split: test
1161
+ metrics:
1162
+ - type: map_at_1
1163
+ value: 9.357
1164
+ - type: map_at_10
1165
+ value: 18.849
1166
+ - type: map_at_100
1167
+ value: 25.369000000000003
1168
+ - type: map_at_1000
1169
+ value: 26.950000000000003
1170
+ - type: map_at_3
1171
+ value: 13.625000000000002
1172
+ - type: map_at_5
1173
+ value: 15.956999999999999
1174
+ - type: mrr_at_1
1175
+ value: 67.75
1176
+ - type: mrr_at_10
1177
+ value: 74.734
1178
+ - type: mrr_at_100
1179
+ value: 75.1
1180
+ - type: mrr_at_1000
1181
+ value: 75.10900000000001
1182
+ - type: mrr_at_3
1183
+ value: 73.542
1184
+ - type: mrr_at_5
1185
+ value: 74.167
1186
+ - type: ndcg_at_1
1187
+ value: 55.375
1188
+ - type: ndcg_at_10
1189
+ value: 39.873999999999995
1190
+ - type: ndcg_at_100
1191
+ value: 43.098
1192
+ - type: ndcg_at_1000
1193
+ value: 50.69200000000001
1194
+ - type: ndcg_at_3
1195
+ value: 44.856
1196
+ - type: ndcg_at_5
1197
+ value: 42.138999999999996
1198
+ - type: precision_at_1
1199
+ value: 67.75
1200
+ - type: precision_at_10
1201
+ value: 31.1
1202
+ - type: precision_at_100
1203
+ value: 9.303
1204
+ - type: precision_at_1000
1205
+ value: 2.0060000000000002
1206
+ - type: precision_at_3
1207
+ value: 48.25
1208
+ - type: precision_at_5
1209
+ value: 40.949999999999996
1210
+ - type: recall_at_1
1211
+ value: 9.357
1212
+ - type: recall_at_10
1213
+ value: 23.832
1214
+ - type: recall_at_100
1215
+ value: 47.906
1216
+ - type: recall_at_1000
1217
+ value: 71.309
1218
+ - type: recall_at_3
1219
+ value: 14.512
1220
+ - type: recall_at_5
1221
+ value: 18.3
1222
+ - task:
1223
+ type: Classification
1224
+ dataset:
1225
+ type: mteb/emotion
1226
+ name: MTEB EmotionClassification
1227
+ config: default
1228
+ split: test
1229
+ metrics:
1230
+ - type: accuracy
1231
+ value: 49.655
1232
+ - type: f1
1233
+ value: 45.51976190938951
1234
+ - task:
1235
+ type: Retrieval
1236
+ dataset:
1237
+ type: fever
1238
+ name: MTEB FEVER
1239
+ config: default
1240
+ split: test
1241
+ metrics:
1242
+ - type: map_at_1
1243
+ value: 62.739999999999995
1244
+ - type: map_at_10
1245
+ value: 73.07000000000001
1246
+ - type: map_at_100
1247
+ value: 73.398
1248
+ - type: map_at_1000
1249
+ value: 73.41
1250
+ - type: map_at_3
1251
+ value: 71.33800000000001
1252
+ - type: map_at_5
1253
+ value: 72.423
1254
+ - type: mrr_at_1
1255
+ value: 67.777
1256
+ - type: mrr_at_10
1257
+ value: 77.873
1258
+ - type: mrr_at_100
1259
+ value: 78.091
1260
+ - type: mrr_at_1000
1261
+ value: 78.094
1262
+ - type: mrr_at_3
1263
+ value: 76.375
1264
+ - type: mrr_at_5
1265
+ value: 77.316
1266
+ - type: ndcg_at_1
1267
+ value: 67.777
1268
+ - type: ndcg_at_10
1269
+ value: 78.24
1270
+ - type: ndcg_at_100
1271
+ value: 79.557
1272
+ - type: ndcg_at_1000
1273
+ value: 79.814
1274
+ - type: ndcg_at_3
1275
+ value: 75.125
1276
+ - type: ndcg_at_5
1277
+ value: 76.834
1278
+ - type: precision_at_1
1279
+ value: 67.777
1280
+ - type: precision_at_10
1281
+ value: 9.832
1282
+ - type: precision_at_100
1283
+ value: 1.061
1284
+ - type: precision_at_1000
1285
+ value: 0.11
1286
+ - type: precision_at_3
1287
+ value: 29.433
1288
+ - type: precision_at_5
1289
+ value: 18.665000000000003
1290
+ - type: recall_at_1
1291
+ value: 62.739999999999995
1292
+ - type: recall_at_10
1293
+ value: 89.505
1294
+ - type: recall_at_100
1295
+ value: 95.102
1296
+ - type: recall_at_1000
1297
+ value: 96.825
1298
+ - type: recall_at_3
1299
+ value: 81.028
1300
+ - type: recall_at_5
1301
+ value: 85.28099999999999
1302
+ - task:
1303
+ type: Retrieval
1304
+ dataset:
1305
+ type: fiqa
1306
+ name: MTEB FiQA2018
1307
+ config: default
1308
+ split: test
1309
+ metrics:
1310
+ - type: map_at_1
1311
+ value: 18.467
1312
+ - type: map_at_10
1313
+ value: 30.020999999999997
1314
+ - type: map_at_100
1315
+ value: 31.739
1316
+ - type: map_at_1000
1317
+ value: 31.934
1318
+ - type: map_at_3
1319
+ value: 26.003
1320
+ - type: map_at_5
1321
+ value: 28.338
1322
+ - type: mrr_at_1
1323
+ value: 35.339999999999996
1324
+ - type: mrr_at_10
1325
+ value: 44.108999999999995
1326
+ - type: mrr_at_100
1327
+ value: 44.993
1328
+ - type: mrr_at_1000
1329
+ value: 45.042
1330
+ - type: mrr_at_3
1331
+ value: 41.667
1332
+ - type: mrr_at_5
1333
+ value: 43.14
1334
+ - type: ndcg_at_1
1335
+ value: 35.339999999999996
1336
+ - type: ndcg_at_10
1337
+ value: 37.202
1338
+ - type: ndcg_at_100
1339
+ value: 43.852999999999994
1340
+ - type: ndcg_at_1000
1341
+ value: 47.235
1342
+ - type: ndcg_at_3
1343
+ value: 33.5
1344
+ - type: ndcg_at_5
1345
+ value: 34.985
1346
+ - type: precision_at_1
1347
+ value: 35.339999999999996
1348
+ - type: precision_at_10
1349
+ value: 10.247
1350
+ - type: precision_at_100
1351
+ value: 1.7149999999999999
1352
+ - type: precision_at_1000
1353
+ value: 0.232
1354
+ - type: precision_at_3
1355
+ value: 22.222
1356
+ - type: precision_at_5
1357
+ value: 16.573999999999998
1358
+ - type: recall_at_1
1359
+ value: 18.467
1360
+ - type: recall_at_10
1361
+ value: 44.080999999999996
1362
+ - type: recall_at_100
1363
+ value: 68.72200000000001
1364
+ - type: recall_at_1000
1365
+ value: 89.087
1366
+ - type: recall_at_3
1367
+ value: 30.567
1368
+ - type: recall_at_5
1369
+ value: 36.982
1370
+ - task:
1371
+ type: Retrieval
1372
+ dataset:
1373
+ type: hotpotqa
1374
+ name: MTEB HotpotQA
1375
+ config: default
1376
+ split: test
1377
+ metrics:
1378
+ - type: map_at_1
1379
+ value: 35.726
1380
+ - type: map_at_10
1381
+ value: 50.207
1382
+ - type: map_at_100
1383
+ value: 51.05499999999999
1384
+ - type: map_at_1000
1385
+ value: 51.12799999999999
1386
+ - type: map_at_3
1387
+ value: 47.576
1388
+ - type: map_at_5
1389
+ value: 49.172
1390
+ - type: mrr_at_1
1391
+ value: 71.452
1392
+ - type: mrr_at_10
1393
+ value: 77.41900000000001
1394
+ - type: mrr_at_100
1395
+ value: 77.711
1396
+ - type: mrr_at_1000
1397
+ value: 77.723
1398
+ - type: mrr_at_3
1399
+ value: 76.39399999999999
1400
+ - type: mrr_at_5
1401
+ value: 77.00099999999999
1402
+ - type: ndcg_at_1
1403
+ value: 71.452
1404
+ - type: ndcg_at_10
1405
+ value: 59.260999999999996
1406
+ - type: ndcg_at_100
1407
+ value: 62.424
1408
+ - type: ndcg_at_1000
1409
+ value: 63.951
1410
+ - type: ndcg_at_3
1411
+ value: 55.327000000000005
1412
+ - type: ndcg_at_5
1413
+ value: 57.416999999999994
1414
+ - type: precision_at_1
1415
+ value: 71.452
1416
+ - type: precision_at_10
1417
+ value: 12.061
1418
+ - type: precision_at_100
1419
+ value: 1.455
1420
+ - type: precision_at_1000
1421
+ value: 0.166
1422
+ - type: precision_at_3
1423
+ value: 34.36
1424
+ - type: precision_at_5
1425
+ value: 22.266
1426
+ - type: recall_at_1
1427
+ value: 35.726
1428
+ - type: recall_at_10
1429
+ value: 60.304
1430
+ - type: recall_at_100
1431
+ value: 72.75500000000001
1432
+ - type: recall_at_1000
1433
+ value: 82.978
1434
+ - type: recall_at_3
1435
+ value: 51.54
1436
+ - type: recall_at_5
1437
+ value: 55.665
1438
+ - task:
1439
+ type: Classification
1440
+ dataset:
1441
+ type: mteb/imdb
1442
+ name: MTEB ImdbClassification
1443
+ config: default
1444
+ split: test
1445
+ metrics:
1446
+ - type: accuracy
1447
+ value: 66.63759999999999
1448
+ - type: ap
1449
+ value: 61.48938261286748
1450
+ - type: f1
1451
+ value: 66.35089269264965
1452
+ - task:
1453
+ type: Retrieval
1454
+ dataset:
1455
+ type: msmarco
1456
+ name: MTEB MSMARCO
1457
+ config: default
1458
+ split: validation
1459
+ metrics:
1460
+ - type: map_at_1
1461
+ value: 20.842
1462
+ - type: map_at_10
1463
+ value: 32.992
1464
+ - type: map_at_100
1465
+ value: 34.236
1466
+ - type: map_at_1000
1467
+ value: 34.286
1468
+ - type: map_at_3
1469
+ value: 29.049000000000003
1470
+ - type: map_at_5
1471
+ value: 31.391999999999996
1472
+ - type: mrr_at_1
1473
+ value: 21.375
1474
+ - type: mrr_at_10
1475
+ value: 33.581
1476
+ - type: mrr_at_100
1477
+ value: 34.760000000000005
1478
+ - type: mrr_at_1000
1479
+ value: 34.803
1480
+ - type: mrr_at_3
1481
+ value: 29.704000000000004
1482
+ - type: mrr_at_5
1483
+ value: 32.015
1484
+ - type: ndcg_at_1
1485
+ value: 21.375
1486
+ - type: ndcg_at_10
1487
+ value: 39.905
1488
+ - type: ndcg_at_100
1489
+ value: 45.843
1490
+ - type: ndcg_at_1000
1491
+ value: 47.083999999999996
1492
+ - type: ndcg_at_3
1493
+ value: 31.918999999999997
1494
+ - type: ndcg_at_5
1495
+ value: 36.107
1496
+ - type: precision_at_1
1497
+ value: 21.375
1498
+ - type: precision_at_10
1499
+ value: 6.393
1500
+ - type: precision_at_100
1501
+ value: 0.935
1502
+ - type: precision_at_1000
1503
+ value: 0.104
1504
+ - type: precision_at_3
1505
+ value: 13.663
1506
+ - type: precision_at_5
1507
+ value: 10.324
1508
+ - type: recall_at_1
1509
+ value: 20.842
1510
+ - type: recall_at_10
1511
+ value: 61.17
1512
+ - type: recall_at_100
1513
+ value: 88.518
1514
+ - type: recall_at_1000
1515
+ value: 97.993
1516
+ - type: recall_at_3
1517
+ value: 39.571
1518
+ - type: recall_at_5
1519
+ value: 49.653999999999996
1520
+ - task:
1521
+ type: Classification
1522
+ dataset:
1523
+ type: mteb/mtop_domain
1524
+ name: MTEB MTOPDomainClassification (en)
1525
+ config: en
1526
+ split: test
1527
+ metrics:
1528
+ - type: accuracy
1529
+ value: 93.46557227542178
1530
+ - type: f1
1531
+ value: 92.87345917772146
1532
+ - task:
1533
+ type: Classification
1534
+ dataset:
1535
+ type: mteb/mtop_intent
1536
+ name: MTEB MTOPIntentClassification (en)
1537
+ config: en
1538
+ split: test
1539
  metrics:
1540
  - type: accuracy
1541
+ value: 72.42134062927497
1542
+ - type: f1
1543
+ value: 55.03624810959269
1544
+ - task:
1545
+ type: Classification
1546
+ dataset:
1547
+ type: mteb/amazon_massive_intent
1548
+ name: MTEB MassiveIntentClassification (en)
1549
+ config: en
1550
+ split: test
1551
+ metrics:
1552
+ - type: accuracy
1553
+ value: 70.3866845998655
1554
+ - type: f1
1555
+ value: 68.9674519872921
1556
+ - task:
1557
+ type: Classification
1558
+ dataset:
1559
+ type: mteb/amazon_massive_scenario
1560
+ name: MTEB MassiveScenarioClassification (en)
1561
+ config: en
1562
+ split: test
1563
+ metrics:
1564
+ - type: accuracy
1565
+ value: 76.27774041694687
1566
+ - type: f1
1567
+ value: 76.72936190462792
1568
+ - task:
1569
+ type: Clustering
1570
+ dataset:
1571
+ type: mteb/medrxiv-clustering-p2p
1572
+ name: MTEB MedrxivClusteringP2P
1573
+ config: default
1574
+ split: test
1575
+ metrics:
1576
+ - type: v_measure
1577
+ value: 31.511745925773337
1578
+ - task:
1579
+ type: Clustering
1580
+ dataset:
1581
+ type: mteb/medrxiv-clustering-s2s
1582
+ name: MTEB MedrxivClusteringS2S
1583
+ config: default
1584
+ split: test
1585
+ metrics:
1586
+ - type: v_measure
1587
+ value: 28.764235987575365
1588
+ - task:
1589
+ type: Reranking
1590
+ dataset:
1591
+ type: mteb/mind_small
1592
+ name: MTEB MindSmallReranking
1593
+ config: default
1594
+ split: test
1595
+ metrics:
1596
+ - type: map
1597
+ value: 32.29353136386601
1598
+ - type: mrr
1599
+ value: 33.536774455851685
1600
+ - task:
1601
+ type: Retrieval
1602
+ dataset:
1603
+ type: nfcorpus
1604
+ name: MTEB NFCorpus
1605
+ config: default
1606
+ split: test
1607
+ metrics:
1608
+ - type: map_at_1
1609
+ value: 5.702
1610
+ - type: map_at_10
1611
+ value: 13.642000000000001
1612
+ - type: map_at_100
1613
+ value: 17.503
1614
+ - type: map_at_1000
1615
+ value: 19.126
1616
+ - type: map_at_3
1617
+ value: 9.748
1618
+ - type: map_at_5
1619
+ value: 11.642
1620
+ - type: mrr_at_1
1621
+ value: 45.82
1622
+ - type: mrr_at_10
1623
+ value: 54.821
1624
+ - type: mrr_at_100
1625
+ value: 55.422000000000004
1626
+ - type: mrr_at_1000
1627
+ value: 55.452999999999996
1628
+ - type: mrr_at_3
1629
+ value: 52.373999999999995
1630
+ - type: mrr_at_5
1631
+ value: 53.937000000000005
1632
+ - type: ndcg_at_1
1633
+ value: 44.272
1634
+ - type: ndcg_at_10
1635
+ value: 36.213
1636
+ - type: ndcg_at_100
1637
+ value: 33.829
1638
+ - type: ndcg_at_1000
1639
+ value: 42.557
1640
+ - type: ndcg_at_3
1641
+ value: 40.814
1642
+ - type: ndcg_at_5
1643
+ value: 39.562000000000005
1644
+ - type: precision_at_1
1645
+ value: 45.511
1646
+ - type: precision_at_10
1647
+ value: 27.214
1648
+ - type: precision_at_100
1649
+ value: 8.941
1650
+ - type: precision_at_1000
1651
+ value: 2.1870000000000003
1652
+ - type: precision_at_3
1653
+ value: 37.874
1654
+ - type: precision_at_5
1655
+ value: 34.489
1656
+ - type: recall_at_1
1657
+ value: 5.702
1658
+ - type: recall_at_10
1659
+ value: 17.638
1660
+ - type: recall_at_100
1661
+ value: 34.419
1662
+ - type: recall_at_1000
1663
+ value: 66.41
1664
+ - type: recall_at_3
1665
+ value: 10.914
1666
+ - type: recall_at_5
1667
+ value: 14.032
1668
+ - task:
1669
+ type: Retrieval
1670
+ dataset:
1671
+ type: nq
1672
+ name: MTEB NQ
1673
+ config: default
1674
+ split: test
1675
+ metrics:
1676
+ - type: map_at_1
1677
+ value: 30.567
1678
+ - type: map_at_10
1679
+ value: 45.01
1680
+ - type: map_at_100
1681
+ value: 46.091
1682
+ - type: map_at_1000
1683
+ value: 46.126
1684
+ - type: map_at_3
1685
+ value: 40.897
1686
+ - type: map_at_5
1687
+ value: 43.301
1688
+ - type: mrr_at_1
1689
+ value: 34.56
1690
+ - type: mrr_at_10
1691
+ value: 47.725
1692
+ - type: mrr_at_100
1693
+ value: 48.548
1694
+ - type: mrr_at_1000
1695
+ value: 48.571999999999996
1696
+ - type: mrr_at_3
1697
+ value: 44.361
1698
+ - type: mrr_at_5
1699
+ value: 46.351
1700
+ - type: ndcg_at_1
1701
+ value: 34.531
1702
+ - type: ndcg_at_10
1703
+ value: 52.410000000000004
1704
+ - type: ndcg_at_100
1705
+ value: 56.999
1706
+ - type: ndcg_at_1000
1707
+ value: 57.830999999999996
1708
+ - type: ndcg_at_3
1709
+ value: 44.734
1710
+ - type: ndcg_at_5
1711
+ value: 48.701
1712
+ - type: precision_at_1
1713
+ value: 34.531
1714
+ - type: precision_at_10
1715
+ value: 8.612
1716
+ - type: precision_at_100
1717
+ value: 1.118
1718
+ - type: precision_at_1000
1719
+ value: 0.12
1720
+ - type: precision_at_3
1721
+ value: 20.307
1722
+ - type: precision_at_5
1723
+ value: 14.519000000000002
1724
+ - type: recall_at_1
1725
+ value: 30.567
1726
+ - type: recall_at_10
1727
+ value: 72.238
1728
+ - type: recall_at_100
1729
+ value: 92.154
1730
+ - type: recall_at_1000
1731
+ value: 98.375
1732
+ - type: recall_at_3
1733
+ value: 52.437999999999995
1734
+ - type: recall_at_5
1735
+ value: 61.516999999999996
1736
+ - task:
1737
+ type: Retrieval
1738
+ dataset:
1739
+ type: quora
1740
+ name: MTEB QuoraRetrieval
1741
+ config: default
1742
+ split: test
1743
+ metrics:
1744
+ - type: map_at_1
1745
+ value: 65.98
1746
+ - type: map_at_10
1747
+ value: 80.05600000000001
1748
+ - type: map_at_100
1749
+ value: 80.76299999999999
1750
+ - type: map_at_1000
1751
+ value: 80.786
1752
+ - type: map_at_3
1753
+ value: 76.848
1754
+ - type: map_at_5
1755
+ value: 78.854
1756
+ - type: mrr_at_1
1757
+ value: 75.86
1758
+ - type: mrr_at_10
1759
+ value: 83.397
1760
+ - type: mrr_at_100
1761
+ value: 83.555
1762
+ - type: mrr_at_1000
1763
+ value: 83.557
1764
+ - type: mrr_at_3
1765
+ value: 82.033
1766
+ - type: mrr_at_5
1767
+ value: 82.97
1768
+ - type: ndcg_at_1
1769
+ value: 75.88000000000001
1770
+ - type: ndcg_at_10
1771
+ value: 84.58099999999999
1772
+ - type: ndcg_at_100
1773
+ value: 86.151
1774
+ - type: ndcg_at_1000
1775
+ value: 86.315
1776
+ - type: ndcg_at_3
1777
+ value: 80.902
1778
+ - type: ndcg_at_5
1779
+ value: 82.953
1780
+ - type: precision_at_1
1781
+ value: 75.88000000000001
1782
+ - type: precision_at_10
1783
+ value: 12.986
1784
+ - type: precision_at_100
1785
+ value: 1.5110000000000001
1786
+ - type: precision_at_1000
1787
+ value: 0.156
1788
+ - type: precision_at_3
1789
+ value: 35.382999999999996
1790
+ - type: precision_at_5
1791
+ value: 23.555999999999997
1792
+ - type: recall_at_1
1793
+ value: 65.98
1794
+ - type: recall_at_10
1795
+ value: 93.716
1796
+ - type: recall_at_100
1797
+ value: 99.21799999999999
1798
+ - type: recall_at_1000
1799
+ value: 99.97
1800
+ - type: recall_at_3
1801
+ value: 83.551
1802
+ - type: recall_at_5
1803
+ value: 88.998
1804
+ - task:
1805
+ type: Clustering
1806
+ dataset:
1807
+ type: mteb/reddit-clustering
1808
+ name: MTEB RedditClustering
1809
+ config: default
1810
+ split: test
1811
+ metrics:
1812
+ - type: v_measure
1813
+ value: 40.45148482612238
1814
+ - task:
1815
+ type: Clustering
1816
+ dataset:
1817
+ type: mteb/reddit-clustering-p2p
1818
+ name: MTEB RedditClusteringP2P
1819
+ config: default
1820
+ split: test
1821
+ metrics:
1822
+ - type: v_measure
1823
+ value: 55.749490673039126
1824
+ - task:
1825
+ type: Retrieval
1826
+ dataset:
1827
+ type: scidocs
1828
+ name: MTEB SCIDOCS
1829
+ config: default
1830
+ split: test
1831
+ metrics:
1832
+ - type: map_at_1
1833
+ value: 4.903
1834
+ - type: map_at_10
1835
+ value: 11.926
1836
+ - type: map_at_100
1837
+ value: 13.916999999999998
1838
+ - type: map_at_1000
1839
+ value: 14.215
1840
+ - type: map_at_3
1841
+ value: 8.799999999999999
1842
+ - type: map_at_5
1843
+ value: 10.360999999999999
1844
+ - type: mrr_at_1
1845
+ value: 24.099999999999998
1846
+ - type: mrr_at_10
1847
+ value: 34.482
1848
+ - type: mrr_at_100
1849
+ value: 35.565999999999995
1850
+ - type: mrr_at_1000
1851
+ value: 35.619
1852
+ - type: mrr_at_3
1853
+ value: 31.433
1854
+ - type: mrr_at_5
1855
+ value: 33.243
1856
+ - type: ndcg_at_1
1857
+ value: 24.099999999999998
1858
+ - type: ndcg_at_10
1859
+ value: 19.872999999999998
1860
+ - type: ndcg_at_100
1861
+ value: 27.606
1862
+ - type: ndcg_at_1000
1863
+ value: 32.811
1864
+ - type: ndcg_at_3
1865
+ value: 19.497999999999998
1866
+ - type: ndcg_at_5
1867
+ value: 16.813
1868
+ - type: precision_at_1
1869
+ value: 24.099999999999998
1870
+ - type: precision_at_10
1871
+ value: 10.08
1872
+ - type: precision_at_100
1873
+ value: 2.122
1874
+ - type: precision_at_1000
1875
+ value: 0.337
1876
+ - type: precision_at_3
1877
+ value: 18.2
1878
+ - type: precision_at_5
1879
+ value: 14.62
1880
+ - type: recall_at_1
1881
+ value: 4.903
1882
+ - type: recall_at_10
1883
+ value: 20.438000000000002
1884
+ - type: recall_at_100
1885
+ value: 43.043
1886
+ - type: recall_at_1000
1887
+ value: 68.41000000000001
1888
+ - type: recall_at_3
1889
+ value: 11.068
1890
+ - type: recall_at_5
1891
+ value: 14.818000000000001
1892
+ - task:
1893
+ type: STS
1894
+ dataset:
1895
+ type: mteb/sickr-sts
1896
+ name: MTEB SICK-R
1897
+ config: default
1898
+ split: test
1899
+ metrics:
1900
+ - type: cos_sim_pearson
1901
+ value: 78.58086597995997
1902
+ - type: cos_sim_spearman
1903
+ value: 69.63214182814991
1904
+ - type: euclidean_pearson
1905
+ value: 72.76175489042691
1906
+ - type: euclidean_spearman
1907
+ value: 67.84965161872971
1908
+ - type: manhattan_pearson
1909
+ value: 72.73812689782592
1910
+ - type: manhattan_spearman
1911
+ value: 67.83610439531277
1912
+ - task:
1913
+ type: STS
1914
+ dataset:
1915
+ type: mteb/sts12-sts
1916
+ name: MTEB STS12
1917
+ config: default
1918
+ split: test
1919
+ metrics:
1920
+ - type: cos_sim_pearson
1921
+ value: 75.13970861325006
1922
+ - type: cos_sim_spearman
1923
+ value: 67.5020551515597
1924
+ - type: euclidean_pearson
1925
+ value: 66.33415412418276
1926
+ - type: euclidean_spearman
1927
+ value: 66.82145056673268
1928
+ - type: manhattan_pearson
1929
+ value: 66.55489484006415
1930
+ - type: manhattan_spearman
1931
+ value: 66.95147433279057
1932
+ - task:
1933
+ type: STS
1934
+ dataset:
1935
+ type: mteb/sts13-sts
1936
+ name: MTEB STS13
1937
+ config: default
1938
+ split: test
1939
+ metrics:
1940
+ - type: cos_sim_pearson
1941
+ value: 78.85850536483447
1942
+ - type: cos_sim_spearman
1943
+ value: 79.1633350177206
1944
+ - type: euclidean_pearson
1945
+ value: 72.74090561408477
1946
+ - type: euclidean_spearman
1947
+ value: 73.57374448302961
1948
+ - type: manhattan_pearson
1949
+ value: 72.92980654233226
1950
+ - type: manhattan_spearman
1951
+ value: 73.72777155112588
1952
+ - task:
1953
+ type: STS
1954
+ dataset:
1955
+ type: mteb/sts14-sts
1956
+ name: MTEB STS14
1957
+ config: default
1958
+ split: test
1959
+ metrics:
1960
+ - type: cos_sim_pearson
1961
+ value: 79.51125593897028
1962
+ - type: cos_sim_spearman
1963
+ value: 74.46048326701329
1964
+ - type: euclidean_pearson
1965
+ value: 70.87726087052985
1966
+ - type: euclidean_spearman
1967
+ value: 67.7721470654411
1968
+ - type: manhattan_pearson
1969
+ value: 71.05892792135637
1970
+ - type: manhattan_spearman
1971
+ value: 67.93472619779037
1972
+ - task:
1973
+ type: STS
1974
+ dataset:
1975
+ type: mteb/sts15-sts
1976
+ name: MTEB STS15
1977
+ config: default
1978
+ split: test
1979
+ metrics:
1980
+ - type: cos_sim_pearson
1981
+ value: 83.8299348880489
1982
+ - type: cos_sim_spearman
1983
+ value: 84.47194637929275
1984
+ - type: euclidean_pearson
1985
+ value: 78.68768462480418
1986
+ - type: euclidean_spearman
1987
+ value: 79.80526323901917
1988
+ - type: manhattan_pearson
1989
+ value: 78.6810718151946
1990
+ - type: manhattan_spearman
1991
+ value: 79.7820584821254
1992
+ - task:
1993
+ type: STS
1994
+ dataset:
1995
+ type: mteb/sts16-sts
1996
+ name: MTEB STS16
1997
+ config: default
1998
+ split: test
1999
+ metrics:
2000
+ - type: cos_sim_pearson
2001
+ value: 79.99206664843005
2002
+ - type: cos_sim_spearman
2003
+ value: 80.96089203722137
2004
+ - type: euclidean_pearson
2005
+ value: 71.31216213716365
2006
+ - type: euclidean_spearman
2007
+ value: 71.45258140049407
2008
+ - type: manhattan_pearson
2009
+ value: 71.26140340402836
2010
+ - type: manhattan_spearman
2011
+ value: 71.3896894666943
2012
+ - task:
2013
+ type: STS
2014
+ dataset:
2015
+ type: mteb/sts17-crosslingual-sts
2016
+ name: MTEB STS17 (en-en)
2017
+ config: en-en
2018
+ split: test
2019
+ metrics:
2020
+ - type: cos_sim_pearson
2021
+ value: 87.35697089594868
2022
+ - type: cos_sim_spearman
2023
+ value: 87.78202647220289
2024
+ - type: euclidean_pearson
2025
+ value: 84.20969668786667
2026
+ - type: euclidean_spearman
2027
+ value: 83.91876425459982
2028
+ - type: manhattan_pearson
2029
+ value: 84.24429755612542
2030
+ - type: manhattan_spearman
2031
+ value: 83.98826315103398
2032
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (en)
+ config: en
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 69.06962775868384
+ - type: cos_sim_spearman
+ value: 69.34889515492327
+ - type: euclidean_pearson
+ value: 69.28108180412313
+ - type: euclidean_spearman
+ value: 69.6437114853659
+ - type: manhattan_pearson
+ value: 69.39974983734993
+ - type: manhattan_spearman
+ value: 69.69057284482079
+ - task:
+ type: STS
+ dataset:
+ type: mteb/stsbenchmark-sts
+ name: MTEB STSBenchmark
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 82.42553734213958
+ - type: cos_sim_spearman
+ value: 81.38977341532744
+ - type: euclidean_pearson
+ value: 76.47494587945522
+ - type: euclidean_spearman
+ value: 75.92794860531089
+ - type: manhattan_pearson
+ value: 76.4768777169467
+ - type: manhattan_spearman
+ value: 75.9252673228599
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/scidocs-reranking
+ name: MTEB SciDocsRR
+ config: default
+ split: test
+ metrics:
+ - type: map
+ value: 80.78825425914722
+ - type: mrr
+ value: 94.60017197762296
+ - task:
+ type: Retrieval
+ dataset:
+ type: scifact
+ name: MTEB SciFact
+ config: default
+ split: test
+ metrics:
+ - type: map_at_1
+ value: 60.633
+ - type: map_at_10
+ value: 70.197
+ - type: map_at_100
+ value: 70.758
+ - type: map_at_1000
+ value: 70.765
+ - type: map_at_3
+ value: 67.082
+ - type: map_at_5
+ value: 69.209
+ - type: mrr_at_1
+ value: 63.333
+ - type: mrr_at_10
+ value: 71.17
+ - type: mrr_at_100
+ value: 71.626
+ - type: mrr_at_1000
+ value: 71.633
+ - type: mrr_at_3
+ value: 68.833
+ - type: mrr_at_5
+ value: 70.6
+ - type: ndcg_at_1
+ value: 63.333
+ - type: ndcg_at_10
+ value: 74.697
+ - type: ndcg_at_100
+ value: 76.986
+ - type: ndcg_at_1000
+ value: 77.225
+ - type: ndcg_at_3
+ value: 69.527
+ - type: ndcg_at_5
+ value: 72.816
+ - type: precision_at_1
+ value: 63.333
+ - type: precision_at_10
+ value: 9.9
+ - type: precision_at_100
+ value: 1.103
+ - type: precision_at_1000
+ value: 0.11199999999999999
+ - type: precision_at_3
+ value: 26.889000000000003
+ - type: precision_at_5
+ value: 18.2
+ - type: recall_at_1
+ value: 60.633
+ - type: recall_at_10
+ value: 87.36699999999999
+ - type: recall_at_100
+ value: 97.333
+ - type: recall_at_1000
+ value: 99.333
+ - type: recall_at_3
+ value: 73.656
+ - type: recall_at_5
+ value: 82.083
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/sprintduplicatequestions-pairclassification
+ name: MTEB SprintDuplicateQuestions
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_accuracy
+ value: 99.76633663366337
+ - type: cos_sim_ap
+ value: 93.84024096781063
+ - type: cos_sim_f1
+ value: 88.08080808080808
+ - type: cos_sim_precision
+ value: 88.9795918367347
+ - type: cos_sim_recall
+ value: 87.2
+ - type: dot_accuracy
+ value: 99.46336633663367
+ - type: dot_ap
+ value: 75.78127156965245
+ - type: dot_f1
+ value: 71.41403865717193
+ - type: dot_precision
+ value: 72.67080745341616
+ - type: dot_recall
+ value: 70.19999999999999
+ - type: euclidean_accuracy
+ value: 99.67524752475248
+ - type: euclidean_ap
+ value: 88.61274955249769
+ - type: euclidean_f1
+ value: 82.30852211434735
+ - type: euclidean_precision
+ value: 89.34426229508196
+ - type: euclidean_recall
+ value: 76.3
+ - type: manhattan_accuracy
+ value: 99.67722772277227
+ - type: manhattan_ap
+ value: 88.77516158012779
+ - type: manhattan_f1
+ value: 82.36536430834212
+ - type: manhattan_precision
+ value: 87.24832214765101
+ - type: manhattan_recall
+ value: 78.0
+ - type: max_accuracy
+ value: 99.76633663366337
+ - type: max_ap
+ value: 93.84024096781063
+ - type: max_f1
+ value: 88.08080808080808
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering
+ name: MTEB StackExchangeClustering
+ config: default
+ split: test
+ metrics:
+ - type: v_measure
+ value: 59.20812266121527
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/stackexchange-clustering-p2p
+ name: MTEB StackExchangeClusteringP2P
+ config: default
+ split: test
+ metrics:
+ - type: v_measure
+ value: 33.954248554638056
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/stackoverflowdupquestions-reranking
+ name: MTEB StackOverflowDupQuestions
+ config: default
+ split: test
+ metrics:
+ - type: map
+ value: 51.52800990025549
+ - type: mrr
+ value: 52.360394915541974
+ - task:
+ type: Summarization
+ dataset:
+ type: mteb/summeval
+ name: MTEB SummEval
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_pearson
+ value: 24.57438758817976
+ - type: cos_sim_spearman
+ value: 24.747448399760643
+ - type: dot_pearson
+ value: 26.589017584184987
+ - type: dot_spearman
+ value: 25.653620812462783
+ - task:
+ type: Retrieval
+ dataset:
+ type: trec-covid
+ name: MTEB TRECCOVID
+ config: default
+ split: test
+ metrics:
+ - type: map_at_1
+ value: 0.253
+ - type: map_at_10
+ value: 2.1399999999999997
+ - type: map_at_100
+ value: 12.873000000000001
+ - type: map_at_1000
+ value: 31.002000000000002
+ - type: map_at_3
+ value: 0.711
+ - type: map_at_5
+ value: 1.125
+ - type: mrr_at_1
+ value: 96.0
+ - type: mrr_at_10
+ value: 98.0
+ - type: mrr_at_100
+ value: 98.0
+ - type: mrr_at_1000
+ value: 98.0
+ - type: mrr_at_3
+ value: 98.0
+ - type: mrr_at_5
+ value: 98.0
+ - type: ndcg_at_1
+ value: 94.0
+ - type: ndcg_at_10
+ value: 84.881
+ - type: ndcg_at_100
+ value: 64.694
+ - type: ndcg_at_1000
+ value: 56.85
+ - type: ndcg_at_3
+ value: 90.061
+ - type: ndcg_at_5
+ value: 87.155
+ - type: precision_at_1
+ value: 96.0
+ - type: precision_at_10
+ value: 88.8
+ - type: precision_at_100
+ value: 65.7
+ - type: precision_at_1000
+ value: 25.080000000000002
+ - type: precision_at_3
+ value: 92.667
+ - type: precision_at_5
+ value: 90.0
+ - type: recall_at_1
+ value: 0.253
+ - type: recall_at_10
+ value: 2.292
+ - type: recall_at_100
+ value: 15.78
+ - type: recall_at_1000
+ value: 53.015
+ - type: recall_at_3
+ value: 0.7270000000000001
+ - type: recall_at_5
+ value: 1.162
+ - task:
+ type: Retrieval
+ dataset:
+ type: webis-touche2020
+ name: MTEB Touche2020
+ config: default
+ split: test
+ metrics:
+ - type: map_at_1
+ value: 2.116
+ - type: map_at_10
+ value: 9.625
+ - type: map_at_100
+ value: 15.641
+ - type: map_at_1000
+ value: 17.127
+ - type: map_at_3
+ value: 4.316
+ - type: map_at_5
+ value: 6.208
+ - type: mrr_at_1
+ value: 32.653
+ - type: mrr_at_10
+ value: 48.083999999999996
+ - type: mrr_at_100
+ value: 48.631
+ - type: mrr_at_1000
+ value: 48.649
+ - type: mrr_at_3
+ value: 42.857
+ - type: mrr_at_5
+ value: 46.224
+ - type: ndcg_at_1
+ value: 29.592000000000002
+ - type: ndcg_at_10
+ value: 25.430999999999997
+ - type: ndcg_at_100
+ value: 36.344
+ - type: ndcg_at_1000
+ value: 47.676
+ - type: ndcg_at_3
+ value: 26.144000000000002
+ - type: ndcg_at_5
+ value: 26.304
+ - type: precision_at_1
+ value: 32.653
+ - type: precision_at_10
+ value: 24.082
+ - type: precision_at_100
+ value: 7.714
+ - type: precision_at_1000
+ value: 1.5310000000000001
+ - type: precision_at_3
+ value: 26.531
+ - type: precision_at_5
+ value: 26.939
+ - type: recall_at_1
+ value: 2.116
+ - type: recall_at_10
+ value: 16.794
+ - type: recall_at_100
+ value: 47.452
+ - type: recall_at_1000
+ value: 82.312
+ - type: recall_at_3
+ value: 5.306
+ - type: recall_at_5
+ value: 9.306000000000001
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/toxic_conversations_50k
+ name: MTEB ToxicConversationsClassification
+ config: default
+ split: test
+ metrics:
+ - type: accuracy
+ value: 67.709
+ - type: ap
+ value: 13.541535578501716
+ - type: f1
+ value: 52.569619919446794
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/tweet_sentiment_extraction
+ name: MTEB TweetSentimentExtractionClassification
+ config: default
+ split: test
+ metrics:
+ - type: accuracy
+ value: 56.850594227504246
+ - type: f1
+ value: 57.233377364910574
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/twentynewsgroups-clustering
+ name: MTEB TwentyNewsgroupsClustering
+ config: default
+ split: test
+ metrics:
+ - type: v_measure
+ value: 39.463722986090474
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twittersemeval2015-pairclassification
+ name: MTEB TwitterSemEval2015
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_accuracy
+ value: 84.09131549144662
+ - type: cos_sim_ap
+ value: 66.86677647503386
+ - type: cos_sim_f1
+ value: 62.94631710362049
+ - type: cos_sim_precision
+ value: 59.73933649289099
+ - type: cos_sim_recall
+ value: 66.51715039577837
+ - type: dot_accuracy
+ value: 80.27656911247541
+ - type: dot_ap
+ value: 54.291720398612085
+ - type: dot_f1
+ value: 54.77150537634409
+ - type: dot_precision
+ value: 47.58660957571039
+ - type: dot_recall
+ value: 64.5118733509235
+ - type: euclidean_accuracy
+ value: 82.76211480002385
+ - type: euclidean_ap
+ value: 62.430397690753296
+ - type: euclidean_f1
+ value: 59.191590539356774
+ - type: euclidean_precision
+ value: 56.296119971435374
+ - type: euclidean_recall
+ value: 62.401055408970976
+ - type: manhattan_accuracy
+ value: 82.7561542588067
+ - type: manhattan_ap
+ value: 62.41882051995577
+ - type: manhattan_f1
+ value: 59.32101002778785
+ - type: manhattan_precision
+ value: 54.71361711611321
+ - type: manhattan_recall
+ value: 64.77572559366754
+ - type: max_accuracy
+ value: 84.09131549144662
+ - type: max_ap
+ value: 66.86677647503386
+ - type: max_f1
+ value: 62.94631710362049
+ - task:
+ type: PairClassification
+ dataset:
+ type: mteb/twitterurlcorpus-pairclassification
+ name: MTEB TwitterURLCorpus
+ config: default
+ split: test
+ metrics:
+ - type: cos_sim_accuracy
+ value: 88.79574649745798
+ - type: cos_sim_ap
+ value: 85.28960532524223
+ - type: cos_sim_f1
+ value: 77.98460043358001
+ - type: cos_sim_precision
+ value: 75.78090948714224
+ - type: cos_sim_recall
+ value: 80.32029565753002
+ - type: dot_accuracy
+ value: 85.5939767920208
+ - type: dot_ap
+ value: 76.14131706694056
+ - type: dot_f1
+ value: 72.70246298696868
+ - type: dot_precision
+ value: 65.27012127894156
+ - type: dot_recall
+ value: 82.04496458269172
+ - type: euclidean_accuracy
+ value: 86.72332828812046
+ - type: euclidean_ap
+ value: 80.84854809178995
+ - type: euclidean_f1
+ value: 72.47657499809551
+ - type: euclidean_precision
+ value: 71.71717171717171
+ - type: euclidean_recall
+ value: 73.25223283030489
+ - type: manhattan_accuracy
+ value: 86.7563162184189
+ - type: manhattan_ap
+ value: 80.87598895575626
+ - type: manhattan_f1
+ value: 72.54617892068092
+ - type: manhattan_precision
+ value: 68.49268225960881
+ - type: manhattan_recall
+ value: 77.10963966738528
+ - type: max_accuracy
+ value: 88.79574649745798
+ - type: max_ap
+ value: 85.28960532524223
+ - type: max_f1
+ value: 77.98460043358001
  ---
 
  # SGPT-5.8B-weightedmean-msmarco-specb-bitfit
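The scores in the metadata above come from the MTEB benchmark. As a rough, non-authoritative illustration, re-running a single task might look like the sketch below; it assumes the checkpoint is published on the Hub as `Muennighoff/SGPT-5.8B-weightedmean-msmarco-specb-bitfit`, that the `mteb` and `sentence-transformers` packages are installed, and that a plain `SentenceTransformer` load is an acceptable approximation of the custom SGPT weighted-mean pooling and SPECB bracket-token handling used for the official numbers.

```python
# Hypothetical sketch: re-running one MTEB task for this model.
# Assumes `pip install mteb sentence-transformers` and enough GPU memory
# for a 5.8B-parameter encoder; exact reproduction of the reported scores
# additionally requires SGPT's weighted-mean pooling and SPECB tokens.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Muennighoff/SGPT-5.8B-weightedmean-msmarco-specb-bitfit")
evaluation = MTEB(tasks=["STSBenchmark"])  # any task name from the metadata above
evaluation.run(model, output_folder="results/sgpt-5.8b-msmarco-specb-bitfit")
```

The run writes one JSON result file per task into the output folder, which is the same shape of data that was converted into the `model-index` metadata above.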