andersonbcdefg committed on
Commit
b7d720b
1 Parent(s): 9e4d032

Update README.md

Files changed (1)
  1. README.md +2599 -1
README.md CHANGED
@@ -5,12 +5,2610 @@ tags:
5
  - feature-extraction
6
  - sentence-similarity
7
  - transformers
 
8
 
9
  ---
10
 
11
- # {MODEL_NAME}
12
 
13
  This is a [sentence-transformers](https://www.SBERT.net) model: It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
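
  As a quick illustration of that description, the snippet below is a minimal usage sketch using the standard sentence-transformers API; the model ID shown is an assumed placeholder for this repository and is not stated in the diff.

  ```python
  # Minimal sketch: embed sentences with sentence-transformers and compare them.
  # NOTE: "andersonbcdefg/bge_micro" is an assumed model ID used only for illustration;
  # substitute the actual repository name when running this.
  from sentence_transformers import SentenceTransformer, util

  model = SentenceTransformer("andersonbcdefg/bge_micro")

  sentences = [
      "This model maps text to a 384-dimensional vector.",
      "Embeddings can be compared for semantic search.",
  ]

  # encode() returns one embedding per sentence; normalizing makes cosine similarity a dot product
  embeddings = model.encode(sentences, normalize_embeddings=True)
  print(embeddings.shape)  # expected: (2, 384)

  # cosine similarity between the two sentence embeddings
  print(float(util.cos_sim(embeddings[0], embeddings[1])))
  ```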
 
 
14
 
15
  <!--- Describe your model here -->
16
 
 
5
  - feature-extraction
6
  - sentence-similarity
7
  - transformers
8
+ - mteb
9
 
10
+ model-index:
11
+ - name: bge_micro
12
+ results:
13
+ - task:
14
+ type: Classification
15
+ dataset:
16
+ type: mteb/amazon_counterfactual
17
+ name: MTEB AmazonCounterfactualClassification (en)
18
+ config: en
19
+ split: test
20
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
21
+ metrics:
22
+ - type: accuracy
23
+ value: 66.26865671641792
24
+ - type: ap
25
+ value: 28.174006539079688
26
+ - type: f1
27
+ value: 59.724963358211035
28
+ - task:
29
+ type: Classification
30
+ dataset:
31
+ type: mteb/amazon_polarity
32
+ name: MTEB AmazonPolarityClassification
33
+ config: default
34
+ split: test
35
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
36
+ metrics:
37
+ - type: accuracy
38
+ value: 75.3691
39
+ - type: ap
40
+ value: 69.64182876373573
41
+ - type: f1
42
+ value: 75.2906345000088
43
+ - task:
44
+ type: Classification
45
+ dataset:
46
+ type: mteb/amazon_reviews_multi
47
+ name: MTEB AmazonReviewsClassification (en)
48
+ config: en
49
+ split: test
50
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
51
+ metrics:
52
+ - type: accuracy
53
+ value: 35.806
54
+ - type: f1
55
+ value: 35.506516495961904
56
+ - task:
57
+ type: Retrieval
58
+ dataset:
59
+ type: arguana
60
+ name: MTEB ArguAna
61
+ config: default
62
+ split: test
63
+ revision: None
64
+ metrics:
65
+ - type: map_at_1
66
+ value: 27.24
67
+ - type: map_at_10
68
+ value: 42.832
69
+ - type: map_at_100
70
+ value: 43.797000000000004
71
+ - type: map_at_1000
72
+ value: 43.804
73
+ - type: map_at_3
74
+ value: 38.134
75
+ - type: map_at_5
76
+ value: 40.744
77
+ - type: mrr_at_1
78
+ value: 27.951999999999998
79
+ - type: mrr_at_10
80
+ value: 43.111
81
+ - type: mrr_at_100
82
+ value: 44.083
83
+ - type: mrr_at_1000
84
+ value: 44.09
85
+ - type: mrr_at_3
86
+ value: 38.431
87
+ - type: mrr_at_5
88
+ value: 41.019
89
+ - type: ndcg_at_1
90
+ value: 27.24
91
+ - type: ndcg_at_10
92
+ value: 51.513
93
+ - type: ndcg_at_100
94
+ value: 55.762
95
+ - type: ndcg_at_1000
96
+ value: 55.938
97
+ - type: ndcg_at_3
98
+ value: 41.743
99
+ - type: ndcg_at_5
100
+ value: 46.454
101
+ - type: precision_at_1
102
+ value: 27.24
103
+ - type: precision_at_10
104
+ value: 7.93
105
+ - type: precision_at_100
106
+ value: 0.9820000000000001
107
+ - type: precision_at_1000
108
+ value: 0.1
109
+ - type: precision_at_3
110
+ value: 17.402
111
+ - type: precision_at_5
112
+ value: 12.731
113
+ - type: recall_at_1
114
+ value: 27.24
115
+ - type: recall_at_10
116
+ value: 79.303
117
+ - type: recall_at_100
118
+ value: 98.151
119
+ - type: recall_at_1000
120
+ value: 99.502
121
+ - type: recall_at_3
122
+ value: 52.205
123
+ - type: recall_at_5
124
+ value: 63.656
125
+ - task:
126
+ type: Clustering
127
+ dataset:
128
+ type: mteb/arxiv-clustering-p2p
129
+ name: MTEB ArxivClusteringP2P
130
+ config: default
131
+ split: test
132
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
133
+ metrics:
134
+ - type: v_measure
135
+ value: 44.59766397469585
136
+ - task:
137
+ type: Clustering
138
+ dataset:
139
+ type: mteb/arxiv-clustering-s2s
140
+ name: MTEB ArxivClusteringS2S
141
+ config: default
142
+ split: test
143
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
144
+ metrics:
145
+ - type: v_measure
146
+ value: 34.480143023109626
147
+ - task:
148
+ type: Reranking
149
+ dataset:
150
+ type: mteb/askubuntudupquestions-reranking
151
+ name: MTEB AskUbuntuDupQuestions
152
+ config: default
153
+ split: test
154
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
155
+ metrics:
156
+ - type: map
157
+ value: 58.09326229984527
158
+ - type: mrr
159
+ value: 72.18429846546191
160
+ - task:
161
+ type: STS
162
+ dataset:
163
+ type: mteb/biosses-sts
164
+ name: MTEB BIOSSES
165
+ config: default
166
+ split: test
167
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
168
+ metrics:
169
+ - type: cos_sim_pearson
170
+ value: 85.47582391622187
171
+ - type: cos_sim_spearman
172
+ value: 83.41635852964214
173
+ - type: euclidean_pearson
174
+ value: 84.21969728559216
175
+ - type: euclidean_spearman
176
+ value: 83.46575724558684
177
+ - type: manhattan_pearson
178
+ value: 83.83107014910223
179
+ - type: manhattan_spearman
180
+ value: 83.13321954800792
181
+ - task:
182
+ type: Classification
183
+ dataset:
184
+ type: mteb/banking77
185
+ name: MTEB Banking77Classification
186
+ config: default
187
+ split: test
188
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
189
+ metrics:
190
+ - type: accuracy
191
+ value: 80.58116883116882
192
+ - type: f1
193
+ value: 80.53335622619781
194
+ - task:
195
+ type: Clustering
196
+ dataset:
197
+ type: mteb/biorxiv-clustering-p2p
198
+ name: MTEB BiorxivClusteringP2P
199
+ config: default
200
+ split: test
201
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
202
+ metrics:
203
+ - type: v_measure
204
+ value: 37.13458676004344
205
+ - task:
206
+ type: Clustering
207
+ dataset:
208
+ type: mteb/biorxiv-clustering-s2s
209
+ name: MTEB BiorxivClusteringS2S
210
+ config: default
211
+ split: test
212
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
213
+ metrics:
214
+ - type: v_measure
215
+ value: 29.720429607514898
216
+ - task:
217
+ type: Retrieval
218
+ dataset:
219
+ type: BeIR/cqadupstack
220
+ name: MTEB CQADupstackAndroidRetrieval
221
+ config: default
222
+ split: test
223
+ revision: None
224
+ metrics:
225
+ - type: map_at_1
226
+ value: 26.051000000000002
227
+ - type: map_at_10
228
+ value: 36.291000000000004
229
+ - type: map_at_100
230
+ value: 37.632
231
+ - type: map_at_1000
232
+ value: 37.772
233
+ - type: map_at_3
234
+ value: 33.288000000000004
235
+ - type: map_at_5
236
+ value: 35.035
237
+ - type: mrr_at_1
238
+ value: 33.333
239
+ - type: mrr_at_10
240
+ value: 42.642
241
+ - type: mrr_at_100
242
+ value: 43.401
243
+ - type: mrr_at_1000
244
+ value: 43.463
245
+ - type: mrr_at_3
246
+ value: 40.272000000000006
247
+ - type: mrr_at_5
248
+ value: 41.753
249
+ - type: ndcg_at_1
250
+ value: 33.333
251
+ - type: ndcg_at_10
252
+ value: 42.291000000000004
253
+ - type: ndcg_at_100
254
+ value: 47.602
255
+ - type: ndcg_at_1000
256
+ value: 50.109
257
+ - type: ndcg_at_3
258
+ value: 38.033
259
+ - type: ndcg_at_5
260
+ value: 40.052
261
+ - type: precision_at_1
262
+ value: 33.333
263
+ - type: precision_at_10
264
+ value: 8.254999999999999
265
+ - type: precision_at_100
266
+ value: 1.353
267
+ - type: precision_at_1000
268
+ value: 0.185
269
+ - type: precision_at_3
270
+ value: 18.884
271
+ - type: precision_at_5
272
+ value: 13.447999999999999
273
+ - type: recall_at_1
274
+ value: 26.051000000000002
275
+ - type: recall_at_10
276
+ value: 53.107000000000006
277
+ - type: recall_at_100
278
+ value: 76.22
279
+ - type: recall_at_1000
280
+ value: 92.92399999999999
281
+ - type: recall_at_3
282
+ value: 40.073
283
+ - type: recall_at_5
284
+ value: 46.327
285
+ - task:
286
+ type: Retrieval
287
+ dataset:
288
+ type: BeIR/cqadupstack
289
+ name: MTEB CQADupstackEnglishRetrieval
290
+ config: default
291
+ split: test
292
+ revision: None
293
+ metrics:
294
+ - type: map_at_1
295
+ value: 19.698999999999998
296
+ - type: map_at_10
297
+ value: 26.186
298
+ - type: map_at_100
299
+ value: 27.133000000000003
300
+ - type: map_at_1000
301
+ value: 27.256999999999998
302
+ - type: map_at_3
303
+ value: 24.264
304
+ - type: map_at_5
305
+ value: 25.307000000000002
306
+ - type: mrr_at_1
307
+ value: 24.712999999999997
308
+ - type: mrr_at_10
309
+ value: 30.703999999999997
310
+ - type: mrr_at_100
311
+ value: 31.445
312
+ - type: mrr_at_1000
313
+ value: 31.517
314
+ - type: mrr_at_3
315
+ value: 28.992
316
+ - type: mrr_at_5
317
+ value: 29.963
318
+ - type: ndcg_at_1
319
+ value: 24.712999999999997
320
+ - type: ndcg_at_10
321
+ value: 30.198000000000004
322
+ - type: ndcg_at_100
323
+ value: 34.412
324
+ - type: ndcg_at_1000
325
+ value: 37.174
326
+ - type: ndcg_at_3
327
+ value: 27.148
328
+ - type: ndcg_at_5
329
+ value: 28.464
330
+ - type: precision_at_1
331
+ value: 24.712999999999997
332
+ - type: precision_at_10
333
+ value: 5.489999999999999
334
+ - type: precision_at_100
335
+ value: 0.955
336
+ - type: precision_at_1000
337
+ value: 0.14400000000000002
338
+ - type: precision_at_3
339
+ value: 12.803
340
+ - type: precision_at_5
341
+ value: 8.981
342
+ - type: recall_at_1
343
+ value: 19.698999999999998
344
+ - type: recall_at_10
345
+ value: 37.595
346
+ - type: recall_at_100
347
+ value: 55.962
348
+ - type: recall_at_1000
349
+ value: 74.836
350
+ - type: recall_at_3
351
+ value: 28.538999999999998
352
+ - type: recall_at_5
353
+ value: 32.279
354
+ - task:
355
+ type: Retrieval
356
+ dataset:
357
+ type: BeIR/cqadupstack
358
+ name: MTEB CQADupstackGamingRetrieval
359
+ config: default
360
+ split: test
361
+ revision: None
362
+ metrics:
363
+ - type: map_at_1
364
+ value: 34.224
365
+ - type: map_at_10
366
+ value: 44.867000000000004
367
+ - type: map_at_100
368
+ value: 45.944
369
+ - type: map_at_1000
370
+ value: 46.013999999999996
371
+ - type: map_at_3
372
+ value: 42.009
373
+ - type: map_at_5
374
+ value: 43.684
375
+ - type: mrr_at_1
376
+ value: 39.436
377
+ - type: mrr_at_10
378
+ value: 48.301
379
+ - type: mrr_at_100
380
+ value: 49.055
381
+ - type: mrr_at_1000
382
+ value: 49.099
383
+ - type: mrr_at_3
384
+ value: 45.956
385
+ - type: mrr_at_5
386
+ value: 47.445
387
+ - type: ndcg_at_1
388
+ value: 39.436
389
+ - type: ndcg_at_10
390
+ value: 50.214000000000006
391
+ - type: ndcg_at_100
392
+ value: 54.63
393
+ - type: ndcg_at_1000
394
+ value: 56.165
395
+ - type: ndcg_at_3
396
+ value: 45.272
397
+ - type: ndcg_at_5
398
+ value: 47.826
399
+ - type: precision_at_1
400
+ value: 39.436
401
+ - type: precision_at_10
402
+ value: 8.037999999999998
403
+ - type: precision_at_100
404
+ value: 1.118
405
+ - type: precision_at_1000
406
+ value: 0.13
407
+ - type: precision_at_3
408
+ value: 20.125
409
+ - type: precision_at_5
410
+ value: 13.918
411
+ - type: recall_at_1
412
+ value: 34.224
413
+ - type: recall_at_10
414
+ value: 62.690999999999995
415
+ - type: recall_at_100
416
+ value: 81.951
417
+ - type: recall_at_1000
418
+ value: 92.93299999999999
419
+ - type: recall_at_3
420
+ value: 49.299
421
+ - type: recall_at_5
422
+ value: 55.533
423
+ - task:
424
+ type: Retrieval
425
+ dataset:
426
+ type: BeIR/cqadupstack
427
+ name: MTEB CQADupstackGisRetrieval
428
+ config: default
429
+ split: test
430
+ revision: None
431
+ metrics:
432
+ - type: map_at_1
433
+ value: 21.375
434
+ - type: map_at_10
435
+ value: 28.366000000000003
436
+ - type: map_at_100
437
+ value: 29.363
438
+ - type: map_at_1000
439
+ value: 29.458000000000002
440
+ - type: map_at_3
441
+ value: 26.247
442
+ - type: map_at_5
443
+ value: 27.439000000000004
444
+ - type: mrr_at_1
445
+ value: 22.938
446
+ - type: mrr_at_10
447
+ value: 30.072
448
+ - type: mrr_at_100
449
+ value: 30.993
450
+ - type: mrr_at_1000
451
+ value: 31.070999999999998
452
+ - type: mrr_at_3
453
+ value: 28.004
454
+ - type: mrr_at_5
455
+ value: 29.179
456
+ - type: ndcg_at_1
457
+ value: 22.938
458
+ - type: ndcg_at_10
459
+ value: 32.516
460
+ - type: ndcg_at_100
461
+ value: 37.641999999999996
462
+ - type: ndcg_at_1000
463
+ value: 40.150999999999996
464
+ - type: ndcg_at_3
465
+ value: 28.341
466
+ - type: ndcg_at_5
467
+ value: 30.394
468
+ - type: precision_at_1
469
+ value: 22.938
470
+ - type: precision_at_10
471
+ value: 5.028
472
+ - type: precision_at_100
473
+ value: 0.8
474
+ - type: precision_at_1000
475
+ value: 0.105
476
+ - type: precision_at_3
477
+ value: 12.052999999999999
478
+ - type: precision_at_5
479
+ value: 8.497
480
+ - type: recall_at_1
481
+ value: 21.375
482
+ - type: recall_at_10
483
+ value: 43.682
484
+ - type: recall_at_100
485
+ value: 67.619
486
+ - type: recall_at_1000
487
+ value: 86.64699999999999
488
+ - type: recall_at_3
489
+ value: 32.478
490
+ - type: recall_at_5
491
+ value: 37.347
492
+ - task:
493
+ type: Retrieval
494
+ dataset:
495
+ type: BeIR/cqadupstack
496
+ name: MTEB CQADupstackMathematicaRetrieval
497
+ config: default
498
+ split: test
499
+ revision: None
500
+ metrics:
501
+ - type: map_at_1
502
+ value: 14.95
503
+ - type: map_at_10
504
+ value: 21.417
505
+ - type: map_at_100
506
+ value: 22.525000000000002
507
+ - type: map_at_1000
508
+ value: 22.665
509
+ - type: map_at_3
510
+ value: 18.684
511
+ - type: map_at_5
512
+ value: 20.275000000000002
513
+ - type: mrr_at_1
514
+ value: 18.159
515
+ - type: mrr_at_10
516
+ value: 25.373
517
+ - type: mrr_at_100
518
+ value: 26.348
519
+ - type: mrr_at_1000
520
+ value: 26.432
521
+ - type: mrr_at_3
522
+ value: 22.698999999999998
523
+ - type: mrr_at_5
524
+ value: 24.254
525
+ - type: ndcg_at_1
526
+ value: 18.159
527
+ - type: ndcg_at_10
528
+ value: 26.043
529
+ - type: ndcg_at_100
530
+ value: 31.491999999999997
531
+ - type: ndcg_at_1000
532
+ value: 34.818
533
+ - type: ndcg_at_3
534
+ value: 21.05
535
+ - type: ndcg_at_5
536
+ value: 23.580000000000002
537
+ - type: precision_at_1
538
+ value: 18.159
539
+ - type: precision_at_10
540
+ value: 4.938
541
+ - type: precision_at_100
542
+ value: 0.872
543
+ - type: precision_at_1000
544
+ value: 0.129
545
+ - type: precision_at_3
546
+ value: 9.908999999999999
547
+ - type: precision_at_5
548
+ value: 7.611999999999999
549
+ - type: recall_at_1
550
+ value: 14.95
551
+ - type: recall_at_10
552
+ value: 36.285000000000004
553
+ - type: recall_at_100
554
+ value: 60.431999999999995
555
+ - type: recall_at_1000
556
+ value: 84.208
557
+ - type: recall_at_3
558
+ value: 23.006
559
+ - type: recall_at_5
560
+ value: 29.304999999999996
561
+ - task:
562
+ type: Retrieval
563
+ dataset:
564
+ type: BeIR/cqadupstack
565
+ name: MTEB CQADupstackPhysicsRetrieval
566
+ config: default
567
+ split: test
568
+ revision: None
569
+ metrics:
570
+ - type: map_at_1
571
+ value: 23.580000000000002
572
+ - type: map_at_10
573
+ value: 32.906
574
+ - type: map_at_100
575
+ value: 34.222
576
+ - type: map_at_1000
577
+ value: 34.346
578
+ - type: map_at_3
579
+ value: 29.891000000000002
580
+ - type: map_at_5
581
+ value: 31.679000000000002
582
+ - type: mrr_at_1
583
+ value: 28.778
584
+ - type: mrr_at_10
585
+ value: 37.783
586
+ - type: mrr_at_100
587
+ value: 38.746
588
+ - type: mrr_at_1000
589
+ value: 38.804
590
+ - type: mrr_at_3
591
+ value: 35.098
592
+ - type: mrr_at_5
593
+ value: 36.739
594
+ - type: ndcg_at_1
595
+ value: 28.778
596
+ - type: ndcg_at_10
597
+ value: 38.484
598
+ - type: ndcg_at_100
599
+ value: 44.322
600
+ - type: ndcg_at_1000
601
+ value: 46.772000000000006
602
+ - type: ndcg_at_3
603
+ value: 33.586
604
+ - type: ndcg_at_5
605
+ value: 36.098
606
+ - type: precision_at_1
607
+ value: 28.778
608
+ - type: precision_at_10
609
+ value: 7.151000000000001
610
+ - type: precision_at_100
611
+ value: 1.185
612
+ - type: precision_at_1000
613
+ value: 0.158
614
+ - type: precision_at_3
615
+ value: 16.105
616
+ - type: precision_at_5
617
+ value: 11.704
618
+ - type: recall_at_1
619
+ value: 23.580000000000002
620
+ - type: recall_at_10
621
+ value: 50.151999999999994
622
+ - type: recall_at_100
623
+ value: 75.114
624
+ - type: recall_at_1000
625
+ value: 91.467
626
+ - type: recall_at_3
627
+ value: 36.552
628
+ - type: recall_at_5
629
+ value: 43.014
630
+ - task:
631
+ type: Retrieval
632
+ dataset:
633
+ type: BeIR/cqadupstack
634
+ name: MTEB CQADupstackProgrammersRetrieval
635
+ config: default
636
+ split: test
637
+ revision: None
638
+ metrics:
639
+ - type: map_at_1
640
+ value: 20.669999999999998
641
+ - type: map_at_10
642
+ value: 28.687
643
+ - type: map_at_100
644
+ value: 30.061
645
+ - type: map_at_1000
646
+ value: 30.197000000000003
647
+ - type: map_at_3
648
+ value: 26.134
649
+ - type: map_at_5
650
+ value: 27.508
651
+ - type: mrr_at_1
652
+ value: 26.256
653
+ - type: mrr_at_10
654
+ value: 34.105999999999995
655
+ - type: mrr_at_100
656
+ value: 35.137
657
+ - type: mrr_at_1000
658
+ value: 35.214
659
+ - type: mrr_at_3
660
+ value: 31.791999999999998
661
+ - type: mrr_at_5
662
+ value: 33.145
663
+ - type: ndcg_at_1
664
+ value: 26.256
665
+ - type: ndcg_at_10
666
+ value: 33.68
667
+ - type: ndcg_at_100
668
+ value: 39.7
669
+ - type: ndcg_at_1000
670
+ value: 42.625
671
+ - type: ndcg_at_3
672
+ value: 29.457
673
+ - type: ndcg_at_5
674
+ value: 31.355
675
+ - type: precision_at_1
676
+ value: 26.256
677
+ - type: precision_at_10
678
+ value: 6.2330000000000005
679
+ - type: precision_at_100
680
+ value: 1.08
681
+ - type: precision_at_1000
682
+ value: 0.149
683
+ - type: precision_at_3
684
+ value: 14.193
685
+ - type: precision_at_5
686
+ value: 10.113999999999999
687
+ - type: recall_at_1
688
+ value: 20.669999999999998
689
+ - type: recall_at_10
690
+ value: 43.254999999999995
691
+ - type: recall_at_100
692
+ value: 69.118
693
+ - type: recall_at_1000
694
+ value: 89.408
695
+ - type: recall_at_3
696
+ value: 31.135
697
+ - type: recall_at_5
698
+ value: 36.574
699
+ - task:
700
+ type: Retrieval
701
+ dataset:
702
+ type: BeIR/cqadupstack
703
+ name: MTEB CQADupstackRetrieval
704
+ config: default
705
+ split: test
706
+ revision: None
707
+ metrics:
708
+ - type: map_at_1
709
+ value: 21.488833333333336
710
+ - type: map_at_10
711
+ value: 29.025416666666665
712
+ - type: map_at_100
713
+ value: 30.141249999999992
714
+ - type: map_at_1000
715
+ value: 30.264083333333335
716
+ - type: map_at_3
717
+ value: 26.599333333333337
718
+ - type: map_at_5
719
+ value: 28.004666666666665
720
+ - type: mrr_at_1
721
+ value: 25.515
722
+ - type: mrr_at_10
723
+ value: 32.8235
724
+ - type: mrr_at_100
725
+ value: 33.69958333333333
726
+ - type: mrr_at_1000
727
+ value: 33.77191666666668
728
+ - type: mrr_at_3
729
+ value: 30.581000000000003
730
+ - type: mrr_at_5
731
+ value: 31.919666666666668
732
+ - type: ndcg_at_1
733
+ value: 25.515
734
+ - type: ndcg_at_10
735
+ value: 33.64241666666666
736
+ - type: ndcg_at_100
737
+ value: 38.75816666666667
738
+ - type: ndcg_at_1000
739
+ value: 41.472166666666666
740
+ - type: ndcg_at_3
741
+ value: 29.435083333333335
742
+ - type: ndcg_at_5
743
+ value: 31.519083333333338
744
+ - type: precision_at_1
745
+ value: 25.515
746
+ - type: precision_at_10
747
+ value: 5.89725
748
+ - type: precision_at_100
749
+ value: 0.9918333333333335
750
+ - type: precision_at_1000
751
+ value: 0.14075
752
+ - type: precision_at_3
753
+ value: 13.504000000000001
754
+ - type: precision_at_5
755
+ value: 9.6885
756
+ - type: recall_at_1
757
+ value: 21.488833333333336
758
+ - type: recall_at_10
759
+ value: 43.60808333333333
760
+ - type: recall_at_100
761
+ value: 66.5045
762
+ - type: recall_at_1000
763
+ value: 85.70024999999998
764
+ - type: recall_at_3
765
+ value: 31.922166666666662
766
+ - type: recall_at_5
767
+ value: 37.29758333333334
768
+ - task:
769
+ type: Retrieval
770
+ dataset:
771
+ type: BeIR/cqadupstack
772
+ name: MTEB CQADupstackStatsRetrieval
773
+ config: default
774
+ split: test
775
+ revision: None
776
+ metrics:
777
+ - type: map_at_1
778
+ value: 20.781
779
+ - type: map_at_10
780
+ value: 27.173000000000002
781
+ - type: map_at_100
782
+ value: 27.967
783
+ - type: map_at_1000
784
+ value: 28.061999999999998
785
+ - type: map_at_3
786
+ value: 24.973
787
+ - type: map_at_5
788
+ value: 26.279999999999998
789
+ - type: mrr_at_1
790
+ value: 23.773
791
+ - type: mrr_at_10
792
+ value: 29.849999999999998
793
+ - type: mrr_at_100
794
+ value: 30.595
795
+ - type: mrr_at_1000
796
+ value: 30.669
797
+ - type: mrr_at_3
798
+ value: 27.761000000000003
799
+ - type: mrr_at_5
800
+ value: 29.003
801
+ - type: ndcg_at_1
802
+ value: 23.773
803
+ - type: ndcg_at_10
804
+ value: 31.033
805
+ - type: ndcg_at_100
806
+ value: 35.174
807
+ - type: ndcg_at_1000
808
+ value: 37.72
809
+ - type: ndcg_at_3
810
+ value: 26.927
811
+ - type: ndcg_at_5
812
+ value: 29.047
813
+ - type: precision_at_1
814
+ value: 23.773
815
+ - type: precision_at_10
816
+ value: 4.8469999999999995
817
+ - type: precision_at_100
818
+ value: 0.75
819
+ - type: precision_at_1000
820
+ value: 0.104
821
+ - type: precision_at_3
822
+ value: 11.452
823
+ - type: precision_at_5
824
+ value: 8.129
825
+ - type: recall_at_1
826
+ value: 20.781
827
+ - type: recall_at_10
828
+ value: 40.463
829
+ - type: recall_at_100
830
+ value: 59.483
831
+ - type: recall_at_1000
832
+ value: 78.396
833
+ - type: recall_at_3
834
+ value: 29.241
835
+ - type: recall_at_5
836
+ value: 34.544000000000004
837
+ - task:
838
+ type: Retrieval
839
+ dataset:
840
+ type: BeIR/cqadupstack
841
+ name: MTEB CQADupstackTexRetrieval
842
+ config: default
843
+ split: test
844
+ revision: None
845
+ metrics:
846
+ - type: map_at_1
847
+ value: 15.074000000000002
848
+ - type: map_at_10
849
+ value: 20.757
850
+ - type: map_at_100
851
+ value: 21.72
852
+ - type: map_at_1000
853
+ value: 21.844
854
+ - type: map_at_3
855
+ value: 18.929000000000002
856
+ - type: map_at_5
857
+ value: 19.894000000000002
858
+ - type: mrr_at_1
859
+ value: 18.307000000000002
860
+ - type: mrr_at_10
861
+ value: 24.215
862
+ - type: mrr_at_100
863
+ value: 25.083
864
+ - type: mrr_at_1000
865
+ value: 25.168000000000003
866
+ - type: mrr_at_3
867
+ value: 22.316
868
+ - type: mrr_at_5
869
+ value: 23.36
870
+ - type: ndcg_at_1
871
+ value: 18.307000000000002
872
+ - type: ndcg_at_10
873
+ value: 24.651999999999997
874
+ - type: ndcg_at_100
875
+ value: 29.296
876
+ - type: ndcg_at_1000
877
+ value: 32.538
878
+ - type: ndcg_at_3
879
+ value: 21.243000000000002
880
+ - type: ndcg_at_5
881
+ value: 22.727
882
+ - type: precision_at_1
883
+ value: 18.307000000000002
884
+ - type: precision_at_10
885
+ value: 4.446
886
+ - type: precision_at_100
887
+ value: 0.792
888
+ - type: precision_at_1000
889
+ value: 0.124
890
+ - type: precision_at_3
891
+ value: 9.945
892
+ - type: precision_at_5
893
+ value: 7.123
894
+ - type: recall_at_1
895
+ value: 15.074000000000002
896
+ - type: recall_at_10
897
+ value: 33.031
898
+ - type: recall_at_100
899
+ value: 53.954
900
+ - type: recall_at_1000
901
+ value: 77.631
902
+ - type: recall_at_3
903
+ value: 23.253
904
+ - type: recall_at_5
905
+ value: 27.218999999999998
906
+ - task:
907
+ type: Retrieval
908
+ dataset:
909
+ type: BeIR/cqadupstack
910
+ name: MTEB CQADupstackUnixRetrieval
911
+ config: default
912
+ split: test
913
+ revision: None
914
+ metrics:
915
+ - type: map_at_1
916
+ value: 21.04
917
+ - type: map_at_10
918
+ value: 28.226000000000003
919
+ - type: map_at_100
920
+ value: 29.337999999999997
921
+ - type: map_at_1000
922
+ value: 29.448999999999998
923
+ - type: map_at_3
924
+ value: 25.759
925
+ - type: map_at_5
926
+ value: 27.226
927
+ - type: mrr_at_1
928
+ value: 24.067
929
+ - type: mrr_at_10
930
+ value: 31.646
931
+ - type: mrr_at_100
932
+ value: 32.592999999999996
933
+ - type: mrr_at_1000
934
+ value: 32.668
935
+ - type: mrr_at_3
936
+ value: 29.26
937
+ - type: mrr_at_5
938
+ value: 30.725
939
+ - type: ndcg_at_1
940
+ value: 24.067
941
+ - type: ndcg_at_10
942
+ value: 32.789
943
+ - type: ndcg_at_100
944
+ value: 38.253
945
+ - type: ndcg_at_1000
946
+ value: 40.961
947
+ - type: ndcg_at_3
948
+ value: 28.189999999999998
949
+ - type: ndcg_at_5
950
+ value: 30.557000000000002
951
+ - type: precision_at_1
952
+ value: 24.067
953
+ - type: precision_at_10
954
+ value: 5.532
955
+ - type: precision_at_100
956
+ value: 0.928
957
+ - type: precision_at_1000
958
+ value: 0.128
959
+ - type: precision_at_3
960
+ value: 12.5
961
+ - type: precision_at_5
962
+ value: 9.16
963
+ - type: recall_at_1
964
+ value: 21.04
965
+ - type: recall_at_10
966
+ value: 43.167
967
+ - type: recall_at_100
968
+ value: 67.569
969
+ - type: recall_at_1000
970
+ value: 86.817
971
+ - type: recall_at_3
972
+ value: 31.178
973
+ - type: recall_at_5
974
+ value: 36.730000000000004
975
+ - task:
976
+ type: Retrieval
977
+ dataset:
978
+ type: BeIR/cqadupstack
979
+ name: MTEB CQADupstackWebmastersRetrieval
980
+ config: default
981
+ split: test
982
+ revision: None
983
+ metrics:
984
+ - type: map_at_1
985
+ value: 21.439
986
+ - type: map_at_10
987
+ value: 28.531000000000002
988
+ - type: map_at_100
989
+ value: 29.953999999999997
990
+ - type: map_at_1000
991
+ value: 30.171
992
+ - type: map_at_3
993
+ value: 26.546999999999997
994
+ - type: map_at_5
995
+ value: 27.71
996
+ - type: mrr_at_1
997
+ value: 26.087
998
+ - type: mrr_at_10
999
+ value: 32.635
1000
+ - type: mrr_at_100
1001
+ value: 33.629999999999995
1002
+ - type: mrr_at_1000
1003
+ value: 33.71
1004
+ - type: mrr_at_3
1005
+ value: 30.731
1006
+ - type: mrr_at_5
1007
+ value: 31.807999999999996
1008
+ - type: ndcg_at_1
1009
+ value: 26.087
1010
+ - type: ndcg_at_10
1011
+ value: 32.975
1012
+ - type: ndcg_at_100
1013
+ value: 38.853
1014
+ - type: ndcg_at_1000
1015
+ value: 42.158
1016
+ - type: ndcg_at_3
1017
+ value: 29.894
1018
+ - type: ndcg_at_5
1019
+ value: 31.397000000000002
1020
+ - type: precision_at_1
1021
+ value: 26.087
1022
+ - type: precision_at_10
1023
+ value: 6.2059999999999995
1024
+ - type: precision_at_100
1025
+ value: 1.298
1026
+ - type: precision_at_1000
1027
+ value: 0.22200000000000003
1028
+ - type: precision_at_3
1029
+ value: 14.097000000000001
1030
+ - type: precision_at_5
1031
+ value: 9.959999999999999
1032
+ - type: recall_at_1
1033
+ value: 21.439
1034
+ - type: recall_at_10
1035
+ value: 40.519
1036
+ - type: recall_at_100
1037
+ value: 68.073
1038
+ - type: recall_at_1000
1039
+ value: 89.513
1040
+ - type: recall_at_3
1041
+ value: 31.513
1042
+ - type: recall_at_5
1043
+ value: 35.702
1044
+ - task:
1045
+ type: Retrieval
1046
+ dataset:
1047
+ type: BeIR/cqadupstack
1048
+ name: MTEB CQADupstackWordpressRetrieval
1049
+ config: default
1050
+ split: test
1051
+ revision: None
1052
+ metrics:
1053
+ - type: map_at_1
1054
+ value: 18.983
1055
+ - type: map_at_10
1056
+ value: 24.898
1057
+ - type: map_at_100
1058
+ value: 25.836
1059
+ - type: map_at_1000
1060
+ value: 25.934
1061
+ - type: map_at_3
1062
+ value: 22.467000000000002
1063
+ - type: map_at_5
1064
+ value: 24.019
1065
+ - type: mrr_at_1
1066
+ value: 20.333000000000002
1067
+ - type: mrr_at_10
1068
+ value: 26.555
1069
+ - type: mrr_at_100
1070
+ value: 27.369
1071
+ - type: mrr_at_1000
1072
+ value: 27.448
1073
+ - type: mrr_at_3
1074
+ value: 24.091
1075
+ - type: mrr_at_5
1076
+ value: 25.662000000000003
1077
+ - type: ndcg_at_1
1078
+ value: 20.333000000000002
1079
+ - type: ndcg_at_10
1080
+ value: 28.834
1081
+ - type: ndcg_at_100
1082
+ value: 33.722
1083
+ - type: ndcg_at_1000
1084
+ value: 36.475
1085
+ - type: ndcg_at_3
1086
+ value: 24.08
1087
+ - type: ndcg_at_5
1088
+ value: 26.732
1089
+ - type: precision_at_1
1090
+ value: 20.333000000000002
1091
+ - type: precision_at_10
1092
+ value: 4.603
1093
+ - type: precision_at_100
1094
+ value: 0.771
1095
+ - type: precision_at_1000
1096
+ value: 0.11100000000000002
1097
+ - type: precision_at_3
1098
+ value: 9.982000000000001
1099
+ - type: precision_at_5
1100
+ value: 7.6160000000000005
1101
+ - type: recall_at_1
1102
+ value: 18.983
1103
+ - type: recall_at_10
1104
+ value: 39.35
1105
+ - type: recall_at_100
1106
+ value: 62.559
1107
+ - type: recall_at_1000
1108
+ value: 83.623
1109
+ - type: recall_at_3
1110
+ value: 26.799
1111
+ - type: recall_at_5
1112
+ value: 32.997
1113
+ - task:
1114
+ type: Retrieval
1115
+ dataset:
1116
+ type: climate-fever
1117
+ name: MTEB ClimateFEVER
1118
+ config: default
1119
+ split: test
1120
+ revision: None
1121
+ metrics:
1122
+ - type: map_at_1
1123
+ value: 10.621
1124
+ - type: map_at_10
1125
+ value: 17.298
1126
+ - type: map_at_100
1127
+ value: 18.983
1128
+ - type: map_at_1000
1129
+ value: 19.182
1130
+ - type: map_at_3
1131
+ value: 14.552999999999999
1132
+ - type: map_at_5
1133
+ value: 15.912
1134
+ - type: mrr_at_1
1135
+ value: 23.453
1136
+ - type: mrr_at_10
1137
+ value: 33.932
1138
+ - type: mrr_at_100
1139
+ value: 34.891
1140
+ - type: mrr_at_1000
1141
+ value: 34.943000000000005
1142
+ - type: mrr_at_3
1143
+ value: 30.770999999999997
1144
+ - type: mrr_at_5
1145
+ value: 32.556000000000004
1146
+ - type: ndcg_at_1
1147
+ value: 23.453
1148
+ - type: ndcg_at_10
1149
+ value: 24.771
1150
+ - type: ndcg_at_100
1151
+ value: 31.738
1152
+ - type: ndcg_at_1000
1153
+ value: 35.419
1154
+ - type: ndcg_at_3
1155
+ value: 20.22
1156
+ - type: ndcg_at_5
1157
+ value: 21.698999999999998
1158
+ - type: precision_at_1
1159
+ value: 23.453
1160
+ - type: precision_at_10
1161
+ value: 7.785
1162
+ - type: precision_at_100
1163
+ value: 1.5270000000000001
1164
+ - type: precision_at_1000
1165
+ value: 0.22
1166
+ - type: precision_at_3
1167
+ value: 14.962
1168
+ - type: precision_at_5
1169
+ value: 11.401
1170
+ - type: recall_at_1
1171
+ value: 10.621
1172
+ - type: recall_at_10
1173
+ value: 29.726000000000003
1174
+ - type: recall_at_100
1175
+ value: 53.996
1176
+ - type: recall_at_1000
1177
+ value: 74.878
1178
+ - type: recall_at_3
1179
+ value: 18.572
1180
+ - type: recall_at_5
1181
+ value: 22.994999999999997
1182
+ - task:
1183
+ type: Retrieval
1184
+ dataset:
1185
+ type: dbpedia-entity
1186
+ name: MTEB DBPedia
1187
+ config: default
1188
+ split: test
1189
+ revision: None
1190
+ metrics:
1191
+ - type: map_at_1
1192
+ value: 6.819
1193
+ - type: map_at_10
1194
+ value: 14.188
1195
+ - type: map_at_100
1196
+ value: 19.627
1197
+ - type: map_at_1000
1198
+ value: 20.757
1199
+ - type: map_at_3
1200
+ value: 10.352
1201
+ - type: map_at_5
1202
+ value: 12.096
1203
+ - type: mrr_at_1
1204
+ value: 54.25
1205
+ - type: mrr_at_10
1206
+ value: 63.798
1207
+ - type: mrr_at_100
1208
+ value: 64.25
1209
+ - type: mrr_at_1000
1210
+ value: 64.268
1211
+ - type: mrr_at_3
1212
+ value: 61.667
1213
+ - type: mrr_at_5
1214
+ value: 63.153999999999996
1215
+ - type: ndcg_at_1
1216
+ value: 39.5
1217
+ - type: ndcg_at_10
1218
+ value: 31.064999999999998
1219
+ - type: ndcg_at_100
1220
+ value: 34.701
1221
+ - type: ndcg_at_1000
1222
+ value: 41.687000000000005
1223
+ - type: ndcg_at_3
1224
+ value: 34.455999999999996
1225
+ - type: ndcg_at_5
1226
+ value: 32.919
1227
+ - type: precision_at_1
1228
+ value: 54.25
1229
+ - type: precision_at_10
1230
+ value: 25.4
1231
+ - type: precision_at_100
1232
+ value: 7.79
1233
+ - type: precision_at_1000
1234
+ value: 1.577
1235
+ - type: precision_at_3
1236
+ value: 39.333
1237
+ - type: precision_at_5
1238
+ value: 33.6
1239
+ - type: recall_at_1
1240
+ value: 6.819
1241
+ - type: recall_at_10
1242
+ value: 19.134
1243
+ - type: recall_at_100
1244
+ value: 41.191
1245
+ - type: recall_at_1000
1246
+ value: 64.699
1247
+ - type: recall_at_3
1248
+ value: 11.637
1249
+ - type: recall_at_5
1250
+ value: 14.807
1251
+ - task:
1252
+ type: Classification
1253
+ dataset:
1254
+ type: mteb/emotion
1255
+ name: MTEB EmotionClassification
1256
+ config: default
1257
+ split: test
1258
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
1259
+ metrics:
1260
+ - type: accuracy
1261
+ value: 42.474999999999994
1262
+ - type: f1
1263
+ value: 37.79154895614037
1264
+ - task:
1265
+ type: Retrieval
1266
+ dataset:
1267
+ type: fever
1268
+ name: MTEB FEVER
1269
+ config: default
1270
+ split: test
1271
+ revision: None
1272
+ metrics:
1273
+ - type: map_at_1
1274
+ value: 53.187
1275
+ - type: map_at_10
1276
+ value: 64.031
1277
+ - type: map_at_100
1278
+ value: 64.507
1279
+ - type: map_at_1000
1280
+ value: 64.526
1281
+ - type: map_at_3
1282
+ value: 61.926
1283
+ - type: map_at_5
1284
+ value: 63.278999999999996
1285
+ - type: mrr_at_1
1286
+ value: 57.396
1287
+ - type: mrr_at_10
1288
+ value: 68.296
1289
+ - type: mrr_at_100
1290
+ value: 68.679
1291
+ - type: mrr_at_1000
1292
+ value: 68.688
1293
+ - type: mrr_at_3
1294
+ value: 66.289
1295
+ - type: mrr_at_5
1296
+ value: 67.593
1297
+ - type: ndcg_at_1
1298
+ value: 57.396
1299
+ - type: ndcg_at_10
1300
+ value: 69.64
1301
+ - type: ndcg_at_100
1302
+ value: 71.75399999999999
1303
+ - type: ndcg_at_1000
1304
+ value: 72.179
1305
+ - type: ndcg_at_3
1306
+ value: 65.66199999999999
1307
+ - type: ndcg_at_5
1308
+ value: 67.932
1309
+ - type: precision_at_1
1310
+ value: 57.396
1311
+ - type: precision_at_10
1312
+ value: 9.073
1313
+ - type: precision_at_100
1314
+ value: 1.024
1315
+ - type: precision_at_1000
1316
+ value: 0.107
1317
+ - type: precision_at_3
1318
+ value: 26.133
1319
+ - type: precision_at_5
1320
+ value: 16.943
1321
+ - type: recall_at_1
1322
+ value: 53.187
1323
+ - type: recall_at_10
1324
+ value: 82.839
1325
+ - type: recall_at_100
1326
+ value: 92.231
1327
+ - type: recall_at_1000
1328
+ value: 95.249
1329
+ - type: recall_at_3
1330
+ value: 72.077
1331
+ - type: recall_at_5
1332
+ value: 77.667
1333
+ - task:
1334
+ type: Retrieval
1335
+ dataset:
1336
+ type: fiqa
1337
+ name: MTEB FiQA2018
1338
+ config: default
1339
+ split: test
1340
+ revision: None
1341
+ metrics:
1342
+ - type: map_at_1
1343
+ value: 10.957
1344
+ - type: map_at_10
1345
+ value: 18.427
1346
+ - type: map_at_100
1347
+ value: 19.885
1348
+ - type: map_at_1000
1349
+ value: 20.088
1350
+ - type: map_at_3
1351
+ value: 15.709000000000001
1352
+ - type: map_at_5
1353
+ value: 17.153
1354
+ - type: mrr_at_1
1355
+ value: 22.377
1356
+ - type: mrr_at_10
1357
+ value: 30.076999999999998
1358
+ - type: mrr_at_100
1359
+ value: 31.233
1360
+ - type: mrr_at_1000
1361
+ value: 31.311
1362
+ - type: mrr_at_3
1363
+ value: 27.521
1364
+ - type: mrr_at_5
1365
+ value: 29.025000000000002
1366
+ - type: ndcg_at_1
1367
+ value: 22.377
1368
+ - type: ndcg_at_10
1369
+ value: 24.367
1370
+ - type: ndcg_at_100
1371
+ value: 31.04
1372
+ - type: ndcg_at_1000
1373
+ value: 35.106
1374
+ - type: ndcg_at_3
1375
+ value: 21.051000000000002
1376
+ - type: ndcg_at_5
1377
+ value: 22.231
1378
+ - type: precision_at_1
1379
+ value: 22.377
1380
+ - type: precision_at_10
1381
+ value: 7.005999999999999
1382
+ - type: precision_at_100
1383
+ value: 1.3599999999999999
1384
+ - type: precision_at_1000
1385
+ value: 0.208
1386
+ - type: precision_at_3
1387
+ value: 13.991999999999999
1388
+ - type: precision_at_5
1389
+ value: 10.833
1390
+ - type: recall_at_1
1391
+ value: 10.957
1392
+ - type: recall_at_10
1393
+ value: 30.274
1394
+ - type: recall_at_100
1395
+ value: 55.982
1396
+ - type: recall_at_1000
1397
+ value: 80.757
1398
+ - type: recall_at_3
1399
+ value: 19.55
1400
+ - type: recall_at_5
1401
+ value: 24.105999999999998
1402
+ - task:
1403
+ type: Retrieval
1404
+ dataset:
1405
+ type: hotpotqa
1406
+ name: MTEB HotpotQA
1407
+ config: default
1408
+ split: test
1409
+ revision: None
1410
+ metrics:
1411
+ - type: map_at_1
1412
+ value: 29.526999999999997
1413
+ - type: map_at_10
1414
+ value: 40.714
1415
+ - type: map_at_100
1416
+ value: 41.655
1417
+ - type: map_at_1000
1418
+ value: 41.744
1419
+ - type: map_at_3
1420
+ value: 38.171
1421
+ - type: map_at_5
1422
+ value: 39.646
1423
+ - type: mrr_at_1
1424
+ value: 59.055
1425
+ - type: mrr_at_10
1426
+ value: 66.411
1427
+ - type: mrr_at_100
1428
+ value: 66.85900000000001
1429
+ - type: mrr_at_1000
1430
+ value: 66.88300000000001
1431
+ - type: mrr_at_3
1432
+ value: 64.846
1433
+ - type: mrr_at_5
1434
+ value: 65.824
1435
+ - type: ndcg_at_1
1436
+ value: 59.055
1437
+ - type: ndcg_at_10
1438
+ value: 49.732
1439
+ - type: ndcg_at_100
1440
+ value: 53.441
1441
+ - type: ndcg_at_1000
1442
+ value: 55.354000000000006
1443
+ - type: ndcg_at_3
1444
+ value: 45.551
1445
+ - type: ndcg_at_5
1446
+ value: 47.719
1447
+ - type: precision_at_1
1448
+ value: 59.055
1449
+ - type: precision_at_10
1450
+ value: 10.366
1451
+ - type: precision_at_100
1452
+ value: 1.328
1453
+ - type: precision_at_1000
1454
+ value: 0.158
1455
+ - type: precision_at_3
1456
+ value: 28.322999999999997
1457
+ - type: precision_at_5
1458
+ value: 18.709
1459
+ - type: recall_at_1
1460
+ value: 29.526999999999997
1461
+ - type: recall_at_10
1462
+ value: 51.83
1463
+ - type: recall_at_100
1464
+ value: 66.42099999999999
1465
+ - type: recall_at_1000
1466
+ value: 79.176
1467
+ - type: recall_at_3
1468
+ value: 42.485
1469
+ - type: recall_at_5
1470
+ value: 46.772000000000006
1471
+ - task:
1472
+ type: Classification
1473
+ dataset:
1474
+ type: mteb/imdb
1475
+ name: MTEB ImdbClassification
1476
+ config: default
1477
+ split: test
1478
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1479
+ metrics:
1480
+ - type: accuracy
1481
+ value: 70.69959999999999
1482
+ - type: ap
1483
+ value: 64.95539314492567
1484
+ - type: f1
1485
+ value: 70.5554935943308
1486
+ - task:
1487
+ type: Retrieval
1488
+ dataset:
1489
+ type: msmarco
1490
+ name: MTEB MSMARCO
1491
+ config: default
1492
+ split: dev
1493
+ revision: None
1494
+ metrics:
1495
+ - type: map_at_1
1496
+ value: 13.153
1497
+ - type: map_at_10
1498
+ value: 22.277
1499
+ - type: map_at_100
1500
+ value: 23.462
1501
+ - type: map_at_1000
1502
+ value: 23.546
1503
+ - type: map_at_3
1504
+ value: 19.026
1505
+ - type: map_at_5
1506
+ value: 20.825
1507
+ - type: mrr_at_1
1508
+ value: 13.539000000000001
1509
+ - type: mrr_at_10
1510
+ value: 22.753
1511
+ - type: mrr_at_100
1512
+ value: 23.906
1513
+ - type: mrr_at_1000
1514
+ value: 23.982999999999997
1515
+ - type: mrr_at_3
1516
+ value: 19.484
1517
+ - type: mrr_at_5
1518
+ value: 21.306
1519
+ - type: ndcg_at_1
1520
+ value: 13.553
1521
+ - type: ndcg_at_10
1522
+ value: 27.848
1523
+ - type: ndcg_at_100
1524
+ value: 33.900999999999996
1525
+ - type: ndcg_at_1000
1526
+ value: 36.155
1527
+ - type: ndcg_at_3
1528
+ value: 21.116
1529
+ - type: ndcg_at_5
1530
+ value: 24.349999999999998
1531
+ - type: precision_at_1
1532
+ value: 13.553
1533
+ - type: precision_at_10
1534
+ value: 4.695
1535
+ - type: precision_at_100
1536
+ value: 0.7779999999999999
1537
+ - type: precision_at_1000
1538
+ value: 0.097
1539
+ - type: precision_at_3
1540
+ value: 9.207
1541
+ - type: precision_at_5
1542
+ value: 7.155
1543
+ - type: recall_at_1
1544
+ value: 13.153
1545
+ - type: recall_at_10
1546
+ value: 45.205
1547
+ - type: recall_at_100
1548
+ value: 73.978
1549
+ - type: recall_at_1000
1550
+ value: 91.541
1551
+ - type: recall_at_3
1552
+ value: 26.735
1553
+ - type: recall_at_5
1554
+ value: 34.493
1555
+ - task:
1556
+ type: Classification
1557
+ dataset:
1558
+ type: mteb/mtop_domain
1559
+ name: MTEB MTOPDomainClassification (en)
1560
+ config: en
1561
+ split: test
1562
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1563
+ metrics:
1564
+ - type: accuracy
1565
+ value: 90.2530779753762
1566
+ - type: f1
1567
+ value: 89.59402328284126
1568
+ - task:
1569
+ type: Classification
1570
+ dataset:
1571
+ type: mteb/mtop_intent
1572
+ name: MTEB MTOPIntentClassification (en)
1573
+ config: en
1574
+ split: test
1575
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1576
+ metrics:
1577
+ - type: accuracy
1578
+ value: 67.95029639762883
1579
+ - type: f1
1580
+ value: 48.99988836758662
1581
+ - task:
1582
+ type: Classification
1583
+ dataset:
1584
+ type: mteb/amazon_massive_intent
1585
+ name: MTEB MassiveIntentClassification (en)
1586
+ config: en
1587
+ split: test
1588
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1589
+ metrics:
1590
+ - type: accuracy
1591
+ value: 67.77740416946874
1592
+ - type: f1
1593
+ value: 66.21341120969817
1594
+ - task:
1595
+ type: Classification
1596
+ dataset:
1597
+ type: mteb/amazon_massive_scenario
1598
+ name: MTEB MassiveScenarioClassification (en)
1599
+ config: en
1600
+ split: test
1601
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1602
+ metrics:
1603
+ - type: accuracy
1604
+ value: 73.03631472763955
1605
+ - type: f1
1606
+ value: 72.5779336237941
1607
+ - task:
1608
+ type: Clustering
1609
+ dataset:
1610
+ type: mteb/medrxiv-clustering-p2p
1611
+ name: MTEB MedrxivClusteringP2P
1612
+ config: default
1613
+ split: test
1614
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1615
+ metrics:
1616
+ - type: v_measure
1617
+ value: 31.98182669158824
1618
+ - task:
1619
+ type: Clustering
1620
+ dataset:
1621
+ type: mteb/medrxiv-clustering-s2s
1622
+ name: MTEB MedrxivClusteringS2S
1623
+ config: default
1624
+ split: test
1625
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1626
+ metrics:
1627
+ - type: v_measure
1628
+ value: 29.259462874407582
1629
+ - task:
1630
+ type: Reranking
1631
+ dataset:
1632
+ type: mteb/mind_small
1633
+ name: MTEB MindSmallReranking
1634
+ config: default
1635
+ split: test
1636
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1637
+ metrics:
1638
+ - type: map
1639
+ value: 31.29342377286548
1640
+ - type: mrr
1641
+ value: 32.32805799117226
1642
+ - task:
1643
+ type: Retrieval
1644
+ dataset:
1645
+ type: nfcorpus
1646
+ name: MTEB NFCorpus
1647
+ config: default
1648
+ split: test
1649
+ revision: None
1650
+ metrics:
1651
+ - type: map_at_1
1652
+ value: 4.692
1653
+ - type: map_at_10
1654
+ value: 10.559000000000001
1655
+ - type: map_at_100
1656
+ value: 13.665
1657
+ - type: map_at_1000
1658
+ value: 15.082
1659
+ - type: map_at_3
1660
+ value: 7.68
1661
+ - type: map_at_5
1662
+ value: 8.844000000000001
1663
+ - type: mrr_at_1
1664
+ value: 38.7
1665
+ - type: mrr_at_10
1666
+ value: 47.864000000000004
1667
+ - type: mrr_at_100
1668
+ value: 48.583999999999996
1669
+ - type: mrr_at_1000
1670
+ value: 48.636
1671
+ - type: mrr_at_3
1672
+ value: 45.975
1673
+ - type: mrr_at_5
1674
+ value: 47.074
1675
+ - type: ndcg_at_1
1676
+ value: 36.378
1677
+ - type: ndcg_at_10
1678
+ value: 30.038999999999998
1679
+ - type: ndcg_at_100
1680
+ value: 28.226000000000003
1681
+ - type: ndcg_at_1000
1682
+ value: 36.958
1683
+ - type: ndcg_at_3
1684
+ value: 33.469
1685
+ - type: ndcg_at_5
1686
+ value: 32.096999999999994
1687
+ - type: precision_at_1
1688
+ value: 38.080000000000005
1689
+ - type: precision_at_10
1690
+ value: 22.941
1691
+ - type: precision_at_100
1692
+ value: 7.632
1693
+ - type: precision_at_1000
1694
+ value: 2.0420000000000003
1695
+ - type: precision_at_3
1696
+ value: 31.579
1697
+ - type: precision_at_5
1698
+ value: 28.235
1699
+ - type: recall_at_1
1700
+ value: 4.692
1701
+ - type: recall_at_10
1702
+ value: 14.496
1703
+ - type: recall_at_100
1704
+ value: 29.69
1705
+ - type: recall_at_1000
1706
+ value: 61.229
1707
+ - type: recall_at_3
1708
+ value: 8.871
1709
+ - type: recall_at_5
1710
+ value: 10.825999999999999
1711
+ - task:
1712
+ type: Retrieval
1713
+ dataset:
1714
+ type: nq
1715
+ name: MTEB NQ
1716
+ config: default
1717
+ split: test
1718
+ revision: None
1719
+ metrics:
1720
+ - type: map_at_1
1721
+ value: 13.120000000000001
1722
+ - type: map_at_10
1723
+ value: 24.092
1724
+ - type: map_at_100
1725
+ value: 25.485999999999997
1726
+ - type: map_at_1000
1727
+ value: 25.557999999999996
1728
+ - type: map_at_3
1729
+ value: 20.076
1730
+ - type: map_at_5
1731
+ value: 22.368
1732
+ - type: mrr_at_1
1733
+ value: 15.093
1734
+ - type: mrr_at_10
1735
+ value: 26.142
1736
+ - type: mrr_at_100
1737
+ value: 27.301
1738
+ - type: mrr_at_1000
1739
+ value: 27.357
1740
+ - type: mrr_at_3
1741
+ value: 22.364
1742
+ - type: mrr_at_5
1743
+ value: 24.564
1744
+ - type: ndcg_at_1
1745
+ value: 15.093
1746
+ - type: ndcg_at_10
1747
+ value: 30.734
1748
+ - type: ndcg_at_100
1749
+ value: 37.147999999999996
1750
+ - type: ndcg_at_1000
1751
+ value: 38.997
1752
+ - type: ndcg_at_3
1753
+ value: 22.82
1754
+ - type: ndcg_at_5
1755
+ value: 26.806
1756
+ - type: precision_at_1
1757
+ value: 15.093
1758
+ - type: precision_at_10
1759
+ value: 5.863
1760
+ - type: precision_at_100
1761
+ value: 0.942
1762
+ - type: precision_at_1000
1763
+ value: 0.11199999999999999
1764
+ - type: precision_at_3
1765
+ value: 11.047
1766
+ - type: precision_at_5
1767
+ value: 8.863999999999999
1768
+ - type: recall_at_1
1769
+ value: 13.120000000000001
1770
+ - type: recall_at_10
1771
+ value: 49.189
1772
+ - type: recall_at_100
1773
+ value: 78.032
1774
+ - type: recall_at_1000
1775
+ value: 92.034
1776
+ - type: recall_at_3
1777
+ value: 28.483000000000004
1778
+ - type: recall_at_5
1779
+ value: 37.756
1780
+ - task:
1781
+ type: Retrieval
1782
+ dataset:
1783
+ type: quora
1784
+ name: MTEB QuoraRetrieval
1785
+ config: default
1786
+ split: test
1787
+ revision: None
1788
+ metrics:
1789
+ - type: map_at_1
1790
+ value: 67.765
1791
+ - type: map_at_10
1792
+ value: 81.069
1793
+ - type: map_at_100
1794
+ value: 81.757
1795
+ - type: map_at_1000
1796
+ value: 81.782
1797
+ - type: map_at_3
1798
+ value: 78.148
1799
+ - type: map_at_5
1800
+ value: 79.95400000000001
1801
+ - type: mrr_at_1
1802
+ value: 77.8
1803
+ - type: mrr_at_10
1804
+ value: 84.639
1805
+ - type: mrr_at_100
1806
+ value: 84.789
1807
+ - type: mrr_at_1000
1808
+ value: 84.79100000000001
1809
+ - type: mrr_at_3
1810
+ value: 83.467
1811
+ - type: mrr_at_5
1812
+ value: 84.251
1813
+ - type: ndcg_at_1
1814
+ value: 77.82
1815
+ - type: ndcg_at_10
1816
+ value: 85.286
1817
+ - type: ndcg_at_100
1818
+ value: 86.86500000000001
1819
+ - type: ndcg_at_1000
1820
+ value: 87.062
1821
+ - type: ndcg_at_3
1822
+ value: 82.116
1823
+ - type: ndcg_at_5
1824
+ value: 83.811
1825
+ - type: precision_at_1
1826
+ value: 77.82
1827
+ - type: precision_at_10
1828
+ value: 12.867999999999999
1829
+ - type: precision_at_100
1830
+ value: 1.498
1831
+ - type: precision_at_1000
1832
+ value: 0.156
1833
+ - type: precision_at_3
1834
+ value: 35.723
1835
+ - type: precision_at_5
1836
+ value: 23.52
1837
+ - type: recall_at_1
1838
+ value: 67.765
1839
+ - type: recall_at_10
1840
+ value: 93.381
1841
+ - type: recall_at_100
1842
+ value: 98.901
1843
+ - type: recall_at_1000
1844
+ value: 99.864
1845
+ - type: recall_at_3
1846
+ value: 84.301
1847
+ - type: recall_at_5
1848
+ value: 89.049
1849
+ - task:
1850
+ type: Clustering
1851
+ dataset:
1852
+ type: mteb/reddit-clustering
1853
+ name: MTEB RedditClustering
1854
+ config: default
1855
+ split: test
1856
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
1857
+ metrics:
1858
+ - type: v_measure
1859
+ value: 45.27190981742137
1860
+ - task:
1861
+ type: Clustering
1862
+ dataset:
1863
+ type: mteb/reddit-clustering-p2p
1864
+ name: MTEB RedditClusteringP2P
1865
+ config: default
1866
+ split: test
1867
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
1868
+ metrics:
1869
+ - type: v_measure
1870
+ value: 54.47444004585028
1871
+ - task:
1872
+ type: Retrieval
1873
+ dataset:
1874
+ type: scidocs
1875
+ name: MTEB SCIDOCS
1876
+ config: default
1877
+ split: test
1878
+ revision: None
1879
+ metrics:
1880
+ - type: map_at_1
1881
+ value: 4.213
1882
+ - type: map_at_10
1883
+ value: 10.166
1884
+ - type: map_at_100
1885
+ value: 11.987
1886
+ - type: map_at_1000
1887
+ value: 12.285
1888
+ - type: map_at_3
1889
+ value: 7.538
1890
+ - type: map_at_5
1891
+ value: 8.606
1892
+ - type: mrr_at_1
1893
+ value: 20.8
1894
+ - type: mrr_at_10
1895
+ value: 30.066
1896
+ - type: mrr_at_100
1897
+ value: 31.290000000000003
1898
+ - type: mrr_at_1000
1899
+ value: 31.357000000000003
1900
+ - type: mrr_at_3
1901
+ value: 27.083000000000002
1902
+ - type: mrr_at_5
1903
+ value: 28.748
1904
+ - type: ndcg_at_1
1905
+ value: 20.8
1906
+ - type: ndcg_at_10
1907
+ value: 17.258000000000003
1908
+ - type: ndcg_at_100
1909
+ value: 24.801000000000002
1910
+ - type: ndcg_at_1000
1911
+ value: 30.348999999999997
1912
+ - type: ndcg_at_3
1913
+ value: 16.719
1914
+ - type: ndcg_at_5
1915
+ value: 14.145
1916
+ - type: precision_at_1
1917
+ value: 20.8
1918
+ - type: precision_at_10
1919
+ value: 8.88
1920
+ - type: precision_at_100
1921
+ value: 1.9789999999999999
1922
+ - type: precision_at_1000
1923
+ value: 0.332
1924
+ - type: precision_at_3
1925
+ value: 15.5
1926
+ - type: precision_at_5
1927
+ value: 12.1
1928
+ - type: recall_at_1
1929
+ value: 4.213
1930
+ - type: recall_at_10
1931
+ value: 17.983
1932
+ - type: recall_at_100
1933
+ value: 40.167
1934
+ - type: recall_at_1000
1935
+ value: 67.43
1936
+ - type: recall_at_3
1937
+ value: 9.433
1938
+ - type: recall_at_5
1939
+ value: 12.267999999999999
1940
+ - task:
1941
+ type: STS
1942
+ dataset:
1943
+ type: mteb/sickr-sts
1944
+ name: MTEB SICK-R
1945
+ config: default
1946
+ split: test
1947
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1948
+ metrics:
1949
+ - type: cos_sim_pearson
1950
+ value: 80.36742239848913
1951
+ - type: cos_sim_spearman
1952
+ value: 72.39470010828755
1953
+ - type: euclidean_pearson
1954
+ value: 77.26919895870947
1955
+ - type: euclidean_spearman
1956
+ value: 72.26534999077315
1957
+ - type: manhattan_pearson
1958
+ value: 77.04066349814258
1959
+ - type: manhattan_spearman
1960
+ value: 72.0072248699278
1961
+ - task:
1962
+ type: STS
1963
+ dataset:
1964
+ type: mteb/sts12-sts
1965
+ name: MTEB STS12
1966
+ config: default
1967
+ split: test
1968
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1969
+ metrics:
1970
+ - type: cos_sim_pearson
1971
+ value: 80.26991474037257
1972
+ - type: cos_sim_spearman
1973
+ value: 71.90287122017716
1974
+ - type: euclidean_pearson
1975
+ value: 76.68006075912453
1976
+ - type: euclidean_spearman
1977
+ value: 71.69301858764365
1978
+ - type: manhattan_pearson
1979
+ value: 76.72277285842371
1980
+ - type: manhattan_spearman
1981
+ value: 71.73265239703795
1982
+ - task:
1983
+ type: STS
1984
+ dataset:
1985
+ type: mteb/sts13-sts
1986
+ name: MTEB STS13
1987
+ config: default
1988
+ split: test
1989
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1990
+ metrics:
1991
+ - type: cos_sim_pearson
1992
+ value: 79.74371413317881
1993
+ - type: cos_sim_spearman
1994
+ value: 80.9279612820358
1995
+ - type: euclidean_pearson
1996
+ value: 80.6417435294782
1997
+ - type: euclidean_spearman
1998
+ value: 81.17460969254459
1999
+ - type: manhattan_pearson
2000
+ value: 80.51820155178402
2001
+ - type: manhattan_spearman
2002
+ value: 81.08028700017084
2003
+ - task:
2004
+ type: STS
2005
+ dataset:
2006
+ type: mteb/sts14-sts
2007
+ name: MTEB STS14
2008
+ config: default
2009
+ split: test
2010
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
2011
+ metrics:
2012
+ - type: cos_sim_pearson
2013
+ value: 80.37085777051112
2014
+ - type: cos_sim_spearman
2015
+ value: 76.60308382518285
2016
+ - type: euclidean_pearson
2017
+ value: 79.59684787227351
2018
+ - type: euclidean_spearman
2019
+ value: 76.8769048249242
2020
+ - type: manhattan_pearson
2021
+ value: 79.55617632538295
2022
+ - type: manhattan_spearman
2023
+ value: 76.90186497973124
2024
+ - task:
2025
+ type: STS
2026
+ dataset:
2027
+ type: mteb/sts15-sts
2028
+ name: MTEB STS15
2029
+ config: default
2030
+ split: test
2031
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
2032
+ metrics:
2033
+ - type: cos_sim_pearson
2034
+ value: 83.99513105301321
2035
+ - type: cos_sim_spearman
2036
+ value: 84.92034548133665
2037
+ - type: euclidean_pearson
2038
+ value: 84.70872540095195
2039
+ - type: euclidean_spearman
2040
+ value: 85.14591726040749
2041
+ - type: manhattan_pearson
2042
+ value: 84.65707417430595
2043
+ - type: manhattan_spearman
2044
+ value: 85.10407163865375
2045
+ - task:
2046
+ type: STS
2047
+ dataset:
2048
+ type: mteb/sts16-sts
2049
+ name: MTEB STS16
2050
+ config: default
2051
+ split: test
2052
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
2053
+ metrics:
2054
+ - type: cos_sim_pearson
2055
+ value: 79.40758449150897
2056
+ - type: cos_sim_spearman
2057
+ value: 80.71692246880549
2058
+ - type: euclidean_pearson
2059
+ value: 80.51658552062683
2060
+ - type: euclidean_spearman
2061
+ value: 80.87118389043233
2062
+ - type: manhattan_pearson
2063
+ value: 80.41534690825016
2064
+ - type: manhattan_spearman
2065
+ value: 80.73925282537256
2066
+ - task:
2067
+ type: STS
2068
+ dataset:
2069
+ type: mteb/sts17-crosslingual-sts
2070
+ name: MTEB STS17 (en-en)
2071
+ config: en-en
2072
+ split: test
2073
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
2074
+ metrics:
2075
+ - type: cos_sim_pearson
2076
+ value: 84.93617076910748
2077
+ - type: cos_sim_spearman
2078
+ value: 85.61118538966805
2079
+ - type: euclidean_pearson
2080
+ value: 85.56187558635287
2081
+ - type: euclidean_spearman
2082
+ value: 85.21910090757267
2083
+ - type: manhattan_pearson
2084
+ value: 85.29916699037645
2085
+ - type: manhattan_spearman
2086
+ value: 84.96820527868671
2087
+ - task:
2088
+ type: STS
2089
+ dataset:
2090
+ type: mteb/sts22-crosslingual-sts
2091
+ name: MTEB STS22 (en)
2092
+ config: en
2093
+ split: test
2094
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
2095
+ metrics:
2096
+ - type: cos_sim_pearson
2097
+ value: 64.22294088543077
2098
+ - type: cos_sim_spearman
2099
+ value: 65.89748502901078
2100
+ - type: euclidean_pearson
2101
+ value: 66.15637850660805
2102
+ - type: euclidean_spearman
2103
+ value: 65.86095841381278
2104
+ - type: manhattan_pearson
2105
+ value: 66.80966197857856
2106
+ - type: manhattan_spearman
2107
+ value: 66.48325202219692
2108
+ - task:
2109
+ type: STS
2110
+ dataset:
2111
+ type: mteb/stsbenchmark-sts
2112
+ name: MTEB STSBenchmark
2113
+ config: default
2114
+ split: test
2115
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
2116
+ metrics:
2117
+ - type: cos_sim_pearson
2118
+ value: 81.75298158703048
2119
+ - type: cos_sim_spearman
2120
+ value: 81.32168373072322
2121
+ - type: euclidean_pearson
2122
+ value: 82.3251793712207
2123
+ - type: euclidean_spearman
2124
+ value: 81.31655163330606
2125
+ - type: manhattan_pearson
2126
+ value: 82.14136865023298
2127
+ - type: manhattan_spearman
2128
+ value: 81.13410964028606
2129
+ - task:
2130
+ type: Reranking
2131
+ dataset:
2132
+ type: mteb/scidocs-reranking
2133
+ name: MTEB SciDocsRR
2134
+ config: default
2135
+ split: test
2136
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
2137
+ metrics:
2138
+ - type: map
2139
+ value: 78.77937068780793
2140
+ - type: mrr
2141
+ value: 93.334709952357
2142
+ - task:
2143
+ type: Retrieval
2144
+ dataset:
2145
+ type: scifact
2146
+ name: MTEB SciFact
2147
+ config: default
2148
+ split: test
2149
+ revision: None
2150
+ metrics:
2151
+ - type: map_at_1
2152
+ value: 50.705999999999996
2153
+ - type: map_at_10
2154
+ value: 60.699999999999996
2155
+ - type: map_at_100
2156
+ value: 61.256
2157
+ - type: map_at_1000
2158
+ value: 61.285000000000004
2159
+ - type: map_at_3
2160
+ value: 57.633
2161
+ - type: map_at_5
2162
+ value: 59.648
2163
+ - type: mrr_at_1
2164
+ value: 53.0
2165
+ - type: mrr_at_10
2166
+ value: 61.717999999999996
2167
+ - type: mrr_at_100
2168
+ value: 62.165000000000006
2169
+ - type: mrr_at_1000
2170
+ value: 62.190999999999995
2171
+ - type: mrr_at_3
2172
+ value: 59.389
2173
+ - type: mrr_at_5
2174
+ value: 60.922
2175
+ - type: ndcg_at_1
2176
+ value: 53.0
2177
+ - type: ndcg_at_10
2178
+ value: 65.413
2179
+ - type: ndcg_at_100
2180
+ value: 68.089
2181
+ - type: ndcg_at_1000
2182
+ value: 69.01899999999999
2183
+ - type: ndcg_at_3
2184
+ value: 60.327
2185
+ - type: ndcg_at_5
2186
+ value: 63.263999999999996
2187
+ - type: precision_at_1
2188
+ value: 53.0
2189
+ - type: precision_at_10
2190
+ value: 8.933
2191
+ - type: precision_at_100
2192
+ value: 1.04
2193
+ - type: precision_at_1000
2194
+ value: 0.11199999999999999
2195
+ - type: precision_at_3
2196
+ value: 23.778
2197
+ - type: precision_at_5
2198
+ value: 16.2
2199
+ - type: recall_at_1
2200
+ value: 50.705999999999996
2201
+ - type: recall_at_10
2202
+ value: 78.633
2203
+ - type: recall_at_100
2204
+ value: 91.333
2205
+ - type: recall_at_1000
2206
+ value: 99.0
2207
+ - type: recall_at_3
2208
+ value: 65.328
2209
+ - type: recall_at_5
2210
+ value: 72.583
2211
+ - task:
2212
+ type: PairClassification
2213
+ dataset:
2214
+ type: mteb/sprintduplicatequestions-pairclassification
2215
+ name: MTEB SprintDuplicateQuestions
2216
+ config: default
2217
+ split: test
2218
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
2219
+ metrics:
2220
+ - type: cos_sim_accuracy
2221
+ value: 99.82178217821782
2222
+ - type: cos_sim_ap
2223
+ value: 95.30078788098801
2224
+ - type: cos_sim_f1
2225
+ value: 91.11549851924975
2226
+ - type: cos_sim_precision
2227
+ value: 89.96101364522417
2228
+ - type: cos_sim_recall
2229
+ value: 92.30000000000001
2230
+ - type: dot_accuracy
2231
+ value: 99.74851485148515
2232
+ - type: dot_ap
2233
+ value: 93.12383012680787
2234
+ - type: dot_f1
2235
+ value: 87.17171717171716
2236
+ - type: dot_precision
2237
+ value: 88.06122448979592
2238
+ - type: dot_recall
2239
+ value: 86.3
2240
+ - type: euclidean_accuracy
2241
+ value: 99.82673267326733
2242
+ - type: euclidean_ap
2243
+ value: 95.29507269622621
2244
+ - type: euclidean_f1
2245
+ value: 91.3151364764268
2246
+ - type: euclidean_precision
2247
+ value: 90.64039408866995
2248
+ - type: euclidean_recall
2249
+ value: 92.0
2250
+ - type: manhattan_accuracy
2251
+ value: 99.82178217821782
2252
+ - type: manhattan_ap
2253
+ value: 95.34300712110257
2254
+ - type: manhattan_f1
2255
+ value: 91.05367793240556
2256
+ - type: manhattan_precision
2257
+ value: 90.51383399209486
2258
+ - type: manhattan_recall
2259
+ value: 91.60000000000001
2260
+ - type: max_accuracy
2261
+ value: 99.82673267326733
2262
+ - type: max_ap
2263
+ value: 95.34300712110257
2264
+ - type: max_f1
2265
+ value: 91.3151364764268
2266
+ - task:
2267
+ type: Clustering
2268
+ dataset:
2269
+ type: mteb/stackexchange-clustering
2270
+ name: MTEB StackExchangeClustering
2271
+ config: default
2272
+ split: test
2273
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
2274
+ metrics:
2275
+ - type: v_measure
2276
+ value: 53.10993894014712
2277
+ - task:
2278
+ type: Clustering
2279
+ dataset:
2280
+ type: mteb/stackexchange-clustering-p2p
2281
+ name: MTEB StackExchangeClusteringP2P
2282
+ config: default
2283
+ split: test
2284
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
2285
+ metrics:
2286
+ - type: v_measure
2287
+ value: 34.67216071080345
2288
+ - task:
2289
+ type: Reranking
2290
+ dataset:
2291
+ type: mteb/stackoverflowdupquestions-reranking
2292
+ name: MTEB StackOverflowDupQuestions
2293
+ config: default
2294
+ split: test
2295
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
2296
+ metrics:
2297
+ - type: map
2298
+ value: 48.96344255085851
2299
+ - type: mrr
2300
+ value: 49.816123419064596
2301
+ - task:
2302
+ type: Summarization
2303
+ dataset:
2304
+ type: mteb/summeval
2305
+ name: MTEB SummEval
2306
+ config: default
2307
+ split: test
2308
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
2309
+ metrics:
2310
+ - type: cos_sim_pearson
2311
+ value: 30.580410074992177
2312
+ - type: cos_sim_spearman
2313
+ value: 31.155995112739966
2314
+ - type: dot_pearson
2315
+ value: 31.112094423048998
2316
+ - type: dot_spearman
2317
+ value: 31.29974829801922
2318
+ - task:
2319
+ type: Retrieval
2320
+ dataset:
2321
+ type: trec-covid
2322
+ name: MTEB TRECCOVID
2323
+ config: default
2324
+ split: test
2325
+ revision: None
2326
+ metrics:
2327
+ - type: map_at_1
2328
+ value: 0.17700000000000002
2329
+ - type: map_at_10
2330
+ value: 1.22
2331
+ - type: map_at_100
2332
+ value: 6.2170000000000005
2333
+ - type: map_at_1000
2334
+ value: 15.406
2335
+ - type: map_at_3
2336
+ value: 0.483
2337
+ - type: map_at_5
2338
+ value: 0.729
2339
+ - type: mrr_at_1
2340
+ value: 64.0
2341
+ - type: mrr_at_10
2342
+ value: 76.333
2343
+ - type: mrr_at_100
2344
+ value: 76.47
2345
+ - type: mrr_at_1000
2346
+ value: 76.47
2347
+ - type: mrr_at_3
2348
+ value: 75.0
2349
+ - type: mrr_at_5
2350
+ value: 76.0
2351
+ - type: ndcg_at_1
2352
+ value: 59.0
2353
+ - type: ndcg_at_10
2354
+ value: 52.62
2355
+ - type: ndcg_at_100
2356
+ value: 39.932
2357
+ - type: ndcg_at_1000
2358
+ value: 37.317
2359
+ - type: ndcg_at_3
2360
+ value: 57.123000000000005
2361
+ - type: ndcg_at_5
2362
+ value: 56.376000000000005
2363
+ - type: precision_at_1
2364
+ value: 64.0
2365
+ - type: precision_at_10
2366
+ value: 55.800000000000004
2367
+ - type: precision_at_100
2368
+ value: 41.04
2369
+ - type: precision_at_1000
2370
+ value: 17.124
2371
+ - type: precision_at_3
2372
+ value: 63.333
2373
+ - type: precision_at_5
2374
+ value: 62.0
2375
+ - type: recall_at_1
2376
+ value: 0.17700000000000002
2377
+ - type: recall_at_10
2378
+ value: 1.46
2379
+ - type: recall_at_100
2380
+ value: 9.472999999999999
2381
+ - type: recall_at_1000
2382
+ value: 35.661
2383
+ - type: recall_at_3
2384
+ value: 0.527
2385
+ - type: recall_at_5
2386
+ value: 0.8250000000000001
2387
+ - task:
2388
+ type: Retrieval
2389
+ dataset:
2390
+ type: webis-touche2020
2391
+ name: MTEB Touche2020
2392
+ config: default
2393
+ split: test
2394
+ revision: None
2395
+ metrics:
2396
+ - type: map_at_1
2397
+ value: 1.539
2398
+ - type: map_at_10
2399
+ value: 7.178
2400
+ - type: map_at_100
2401
+ value: 12.543000000000001
2402
+ - type: map_at_1000
2403
+ value: 14.126
2404
+ - type: map_at_3
2405
+ value: 3.09
2406
+ - type: map_at_5
2407
+ value: 5.008
2408
+ - type: mrr_at_1
2409
+ value: 18.367
2410
+ - type: mrr_at_10
2411
+ value: 32.933
2412
+ - type: mrr_at_100
2413
+ value: 34.176
2414
+ - type: mrr_at_1000
2415
+ value: 34.176
2416
+ - type: mrr_at_3
2417
+ value: 27.551
2418
+ - type: mrr_at_5
2419
+ value: 30.714000000000002
2420
+ - type: ndcg_at_1
2421
+ value: 15.306000000000001
2422
+ - type: ndcg_at_10
2423
+ value: 18.343
2424
+ - type: ndcg_at_100
2425
+ value: 30.076000000000004
2426
+ - type: ndcg_at_1000
2427
+ value: 42.266999999999996
2428
+ - type: ndcg_at_3
2429
+ value: 17.233999999999998
2430
+ - type: ndcg_at_5
2431
+ value: 18.677
2432
+ - type: precision_at_1
2433
+ value: 18.367
2434
+ - type: precision_at_10
2435
+ value: 18.367
2436
+ - type: precision_at_100
2437
+ value: 6.837
2438
+ - type: precision_at_1000
2439
+ value: 1.467
2440
+ - type: precision_at_3
2441
+ value: 19.048000000000002
2442
+ - type: precision_at_5
2443
+ value: 21.224
2444
+ - type: recall_at_1
2445
+ value: 1.539
2446
+ - type: recall_at_10
2447
+ value: 13.289000000000001
2448
+ - type: recall_at_100
2449
+ value: 42.480000000000004
2450
+ - type: recall_at_1000
2451
+ value: 79.463
2452
+ - type: recall_at_3
2453
+ value: 4.202999999999999
2454
+ - type: recall_at_5
2455
+ value: 7.9030000000000005
2456
+ - task:
2457
+ type: Classification
2458
+ dataset:
2459
+ type: mteb/toxic_conversations_50k
2460
+ name: MTEB ToxicConversationsClassification
2461
+ config: default
2462
+ split: test
2463
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
2464
+ metrics:
2465
+ - type: accuracy
2466
+ value: 69.2056
2467
+ - type: ap
2468
+ value: 13.564165903349778
2469
+ - type: f1
2470
+ value: 53.303385089202656
2471
+ - task:
2472
+ type: Classification
2473
+ dataset:
2474
+ type: mteb/tweet_sentiment_extraction
2475
+ name: MTEB TweetSentimentExtractionClassification
2476
+ config: default
2477
+ split: test
2478
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
2479
+ metrics:
2480
+ - type: accuracy
2481
+ value: 56.71477079796264
2482
+ - type: f1
2483
+ value: 57.01563439439609
2484
+ - task:
2485
+ type: Clustering
2486
+ dataset:
2487
+ type: mteb/twentynewsgroups-clustering
2488
+ name: MTEB TwentyNewsgroupsClustering
2489
+ config: default
2490
+ split: test
2491
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
2492
+ metrics:
2493
+ - type: v_measure
2494
+ value: 39.373040570976514
2495
+ - task:
2496
+ type: PairClassification
2497
+ dataset:
2498
+ type: mteb/twittersemeval2015-pairclassification
2499
+ name: MTEB TwitterSemEval2015
2500
+ config: default
2501
+ split: test
2502
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
2503
+ metrics:
2504
+ - type: cos_sim_accuracy
2505
+ value: 83.44757703999524
2506
+ - type: cos_sim_ap
2507
+ value: 65.78689843625949
2508
+ - type: cos_sim_f1
2509
+ value: 62.25549384206713
2510
+ - type: cos_sim_precision
2511
+ value: 57.39091718610864
2512
+ - type: cos_sim_recall
2513
+ value: 68.02110817941951
2514
+ - type: dot_accuracy
2515
+ value: 81.3971508612982
2516
+ - type: dot_ap
2517
+ value: 58.42933051967154
2518
+ - type: dot_f1
2519
+ value: 57.85580214198962
2520
+ - type: dot_precision
2521
+ value: 49.74368710841086
2522
+ - type: dot_recall
2523
+ value: 69.12928759894459
2524
+ - type: euclidean_accuracy
2525
+ value: 83.54294569946951
2526
+ - type: euclidean_ap
2527
+ value: 66.10612585693795
2528
+ - type: euclidean_f1
2529
+ value: 62.66666666666667
2530
+ - type: euclidean_precision
2531
+ value: 58.88631090487239
2532
+ - type: euclidean_recall
2533
+ value: 66.96569920844327
2534
+ - type: manhattan_accuracy
2535
+ value: 83.43565595756095
2536
+ - type: manhattan_ap
2537
+ value: 65.88532290329134
2538
+ - type: manhattan_f1
2539
+ value: 62.58408721874276
2540
+ - type: manhattan_precision
2541
+ value: 55.836092715231786
2542
+ - type: manhattan_recall
2543
+ value: 71.18733509234828
2544
+ - type: max_accuracy
2545
+ value: 83.54294569946951
2546
+ - type: max_ap
2547
+ value: 66.10612585693795
2548
+ - type: max_f1
2549
+ value: 62.66666666666667
2550
+ - task:
2551
+ type: PairClassification
2552
+ dataset:
2553
+ type: mteb/twitterurlcorpus-pairclassification
2554
+ name: MTEB TwitterURLCorpus
2555
+ config: default
2556
+ split: test
2557
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
2558
+ metrics:
2559
+ - type: cos_sim_accuracy
2560
+ value: 88.02344083517679
2561
+ - type: cos_sim_ap
2562
+ value: 84.21589190889944
2563
+ - type: cos_sim_f1
2564
+ value: 76.36723039754007
2565
+ - type: cos_sim_precision
2566
+ value: 72.79134682484299
2567
+ - type: cos_sim_recall
2568
+ value: 80.31259624268556
2569
+ - type: dot_accuracy
2570
+ value: 87.43353902278108
2571
+ - type: dot_ap
2572
+ value: 82.08962394120071
2573
+ - type: dot_f1
2574
+ value: 74.97709923664122
2575
+ - type: dot_precision
2576
+ value: 74.34150772025431
2577
+ - type: dot_recall
2578
+ value: 75.62365260240222
2579
+ - type: euclidean_accuracy
2580
+ value: 87.97686963946133
2581
+ - type: euclidean_ap
2582
+ value: 84.20578083922416
2583
+ - type: euclidean_f1
2584
+ value: 76.4299182903834
2585
+ - type: euclidean_precision
2586
+ value: 73.51874244256348
2587
+ - type: euclidean_recall
2588
+ value: 79.58115183246073
2589
+ - type: manhattan_accuracy
2590
+ value: 88.00209570380719
2591
+ - type: manhattan_ap
2592
+ value: 84.14700304263556
2593
+ - type: manhattan_f1
2594
+ value: 76.36429345861944
2595
+ - type: manhattan_precision
2596
+ value: 71.95886119057349
2597
+ - type: manhattan_recall
2598
+ value: 81.34431783184478
2599
+ - type: max_accuracy
2600
+ value: 88.02344083517679
2601
+ - type: max_ap
2602
+ value: 84.21589190889944
2603
+ - type: max_f1
2604
+ value: 76.4299182903834
2605
  ---
2606
 
2607
+ # bge-micro
2608
 
2609
  This is a [sentence-transformers](https://www.SBERT.net) model: it maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for tasks like clustering or semantic search.
2610
+ It is distilled from [bge-small-en-v1.5](https://huggingface.co/BAAI/bge-small-en-v1.5/blob/main/config.json), with 1/4 of the teacher's non-embedding parameters.
2611
+ It has half the parameters of all-MiniLM-L6-v2, the smallest commonly used embedding model, with similar performance; a minimal usage sketch follows.
2612
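
For example, embeddings can be produced with the [sentence-transformers](https://www.SBERT.net) library as sketched below. The repository id in the snippet is a placeholder (this card does not state the final Hub path), so substitute the path this model is actually published under.

```python
from sentence_transformers import SentenceTransformer, util

# Placeholder repo id -- replace with the actual Hub path of this model.
model = SentenceTransformer("your-namespace/bge-micro")

sentences = [
    "The cat sits outside",
    "A man is playing guitar",
    "The new movie is awesome",
]

# Each sentence becomes a 384-dimensional vector; normalizing makes
# cosine similarity equivalent to a plain dot product.
embeddings = model.encode(sentences, normalize_embeddings=True)
print(embeddings.shape)  # (3, 384)

# Pairwise cosine similarities, e.g. for semantic search or clustering.
similarities = util.cos_sim(embeddings, embeddings)
print(similarities)
```

Because the embeddings above are L2-normalized, the cosine scores can equivalently be computed with a dot product, which keeps semantic-search indexes simple.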
 
2613
  <!--- Describe your model here -->
2614