odunola committed
Commit a159451
1 Parent(s): f2122b6

adding files

1_Pooling/config.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "word_embedding_dimension": 768,
+   "pooling_mode_cls_token": false,
+   "pooling_mode_mean_tokens": true,
+   "pooling_mode_max_tokens": false,
+   "pooling_mode_mean_sqrt_len_tokens": false
+ }
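
For context, the pooling configuration above enables only mean pooling: a sentence embedding is the attention-mask-weighted average of the 768-dimensional token embeddings (CLS, max, and sqrt-length pooling are all disabled). Below is a minimal sketch of that pooling step using the Hugging Face transformers library; it is an illustration under those assumptions, and the model path and example text are placeholders, not values taken from this repository.

```python
import torch
from transformers import AutoModel, AutoTokenizer

def mean_pool(last_hidden_state: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # Expand the mask over the hidden dimension so padded tokens contribute zero,
    # then average the remaining token embeddings per sequence.
    mask = attention_mask.unsqueeze(-1).type_as(last_hidden_state)   # (batch, seq, 1)
    summed = (last_hidden_state * mask).sum(dim=1)                   # (batch, hidden)
    counts = mask.sum(dim=1).clamp(min=1e-9)                         # (batch, 1)
    return summed / counts  # hidden size matches word_embedding_dimension = 768

# Placeholder checkpoint path for illustration only.
tokenizer = AutoTokenizer.from_pretrained("path/to/this-model")
model = AutoModel.from_pretrained("path/to/this-model")

# Example input; the "query: " prefix is illustrative.
batch = tokenizer(["query: how do sentence embeddings work?"],
                  padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    out = model(**batch)
embeddings = mean_pool(out.last_hidden_state, batch["attention_mask"])
```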
README.md ADDED
@@ -0,0 +1,2674 @@
1
+ ---
2
+ tags:
3
+ - mteb
4
+ model-index:
5
+ - name: e5-base-v2
6
+ results:
7
+ - task:
8
+ type: Classification
9
+ dataset:
10
+ type: mteb/amazon_counterfactual
11
+ name: MTEB AmazonCounterfactualClassification (en)
12
+ config: en
13
+ split: test
14
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
15
+ metrics:
16
+ - type: accuracy
17
+ value: 77.77611940298506
18
+ - type: ap
19
+ value: 42.052710266606056
20
+ - type: f1
21
+ value: 72.12040628266567
22
+ - task:
23
+ type: Classification
24
+ dataset:
25
+ type: mteb/amazon_polarity
26
+ name: MTEB AmazonPolarityClassification
27
+ config: default
28
+ split: test
29
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
30
+ metrics:
31
+ - type: accuracy
32
+ value: 92.81012500000001
33
+ - type: ap
34
+ value: 89.4213700757244
35
+ - type: f1
36
+ value: 92.8039091197065
37
+ - task:
38
+ type: Classification
39
+ dataset:
40
+ type: mteb/amazon_reviews_multi
41
+ name: MTEB AmazonReviewsClassification (en)
42
+ config: en
43
+ split: test
44
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
45
+ metrics:
46
+ - type: accuracy
47
+ value: 46.711999999999996
48
+ - type: f1
49
+ value: 46.11544975436018
50
+ - task:
51
+ type: Retrieval
52
+ dataset:
53
+ type: arguana
54
+ name: MTEB ArguAna
55
+ config: default
56
+ split: test
57
+ revision: None
58
+ metrics:
59
+ - type: map_at_1
60
+ value: 23.186
61
+ - type: map_at_10
62
+ value: 36.632999999999996
63
+ - type: map_at_100
64
+ value: 37.842
65
+ - type: map_at_1000
66
+ value: 37.865
67
+ - type: map_at_3
68
+ value: 32.278
69
+ - type: map_at_5
70
+ value: 34.760999999999996
71
+ - type: mrr_at_1
72
+ value: 23.400000000000002
73
+ - type: mrr_at_10
74
+ value: 36.721
75
+ - type: mrr_at_100
76
+ value: 37.937
77
+ - type: mrr_at_1000
78
+ value: 37.96
79
+ - type: mrr_at_3
80
+ value: 32.302
81
+ - type: mrr_at_5
82
+ value: 34.894
83
+ - type: ndcg_at_1
84
+ value: 23.186
85
+ - type: ndcg_at_10
86
+ value: 44.49
87
+ - type: ndcg_at_100
88
+ value: 50.065000000000005
89
+ - type: ndcg_at_1000
90
+ value: 50.629999999999995
91
+ - type: ndcg_at_3
92
+ value: 35.461
93
+ - type: ndcg_at_5
94
+ value: 39.969
95
+ - type: precision_at_1
96
+ value: 23.186
97
+ - type: precision_at_10
98
+ value: 6.97
99
+ - type: precision_at_100
100
+ value: 0.951
101
+ - type: precision_at_1000
102
+ value: 0.099
103
+ - type: precision_at_3
104
+ value: 14.912
105
+ - type: precision_at_5
106
+ value: 11.152
107
+ - type: recall_at_1
108
+ value: 23.186
109
+ - type: recall_at_10
110
+ value: 69.70100000000001
111
+ - type: recall_at_100
112
+ value: 95.092
113
+ - type: recall_at_1000
114
+ value: 99.431
115
+ - type: recall_at_3
116
+ value: 44.737
117
+ - type: recall_at_5
118
+ value: 55.761
119
+ - task:
120
+ type: Clustering
121
+ dataset:
122
+ type: mteb/arxiv-clustering-p2p
123
+ name: MTEB ArxivClusteringP2P
124
+ config: default
125
+ split: test
126
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
127
+ metrics:
128
+ - type: v_measure
129
+ value: 46.10312401440185
130
+ - task:
131
+ type: Clustering
132
+ dataset:
133
+ type: mteb/arxiv-clustering-s2s
134
+ name: MTEB ArxivClusteringS2S
135
+ config: default
136
+ split: test
137
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
138
+ metrics:
139
+ - type: v_measure
140
+ value: 39.67275326095384
141
+ - task:
142
+ type: Reranking
143
+ dataset:
144
+ type: mteb/askubuntudupquestions-reranking
145
+ name: MTEB AskUbuntuDupQuestions
146
+ config: default
147
+ split: test
148
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
149
+ metrics:
150
+ - type: map
151
+ value: 58.97793816337376
152
+ - type: mrr
153
+ value: 72.76832431957087
154
+ - task:
155
+ type: STS
156
+ dataset:
157
+ type: mteb/biosses-sts
158
+ name: MTEB BIOSSES
159
+ config: default
160
+ split: test
161
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
162
+ metrics:
163
+ - type: cos_sim_pearson
164
+ value: 83.11646947018187
165
+ - type: cos_sim_spearman
166
+ value: 81.40064994975234
167
+ - type: euclidean_pearson
168
+ value: 82.37355689019232
169
+ - type: euclidean_spearman
170
+ value: 81.6777646977348
171
+ - type: manhattan_pearson
172
+ value: 82.61101422716945
173
+ - type: manhattan_spearman
174
+ value: 81.80427360442245
175
+ - task:
176
+ type: Classification
177
+ dataset:
178
+ type: mteb/banking77
179
+ name: MTEB Banking77Classification
180
+ config: default
181
+ split: test
182
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
183
+ metrics:
184
+ - type: accuracy
185
+ value: 83.52922077922076
186
+ - type: f1
187
+ value: 83.45298679360866
188
+ - task:
189
+ type: Clustering
190
+ dataset:
191
+ type: mteb/biorxiv-clustering-p2p
192
+ name: MTEB BiorxivClusteringP2P
193
+ config: default
194
+ split: test
195
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
196
+ metrics:
197
+ - type: v_measure
198
+ value: 37.495115019668496
199
+ - task:
200
+ type: Clustering
201
+ dataset:
202
+ type: mteb/biorxiv-clustering-s2s
203
+ name: MTEB BiorxivClusteringS2S
204
+ config: default
205
+ split: test
206
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
207
+ metrics:
208
+ - type: v_measure
209
+ value: 32.724792944166765
210
+ - task:
211
+ type: Retrieval
212
+ dataset:
213
+ type: BeIR/cqadupstack
214
+ name: MTEB CQADupstackAndroidRetrieval
215
+ config: default
216
+ split: test
217
+ revision: None
218
+ metrics:
219
+ - type: map_at_1
220
+ value: 32.361000000000004
221
+ - type: map_at_10
222
+ value: 43.765
223
+ - type: map_at_100
224
+ value: 45.224
225
+ - type: map_at_1000
226
+ value: 45.35
227
+ - type: map_at_3
228
+ value: 40.353
229
+ - type: map_at_5
230
+ value: 42.195
231
+ - type: mrr_at_1
232
+ value: 40.629
233
+ - type: mrr_at_10
234
+ value: 50.458000000000006
235
+ - type: mrr_at_100
236
+ value: 51.06699999999999
237
+ - type: mrr_at_1000
238
+ value: 51.12
239
+ - type: mrr_at_3
240
+ value: 47.902
241
+ - type: mrr_at_5
242
+ value: 49.447
243
+ - type: ndcg_at_1
244
+ value: 40.629
245
+ - type: ndcg_at_10
246
+ value: 50.376
247
+ - type: ndcg_at_100
248
+ value: 55.065
249
+ - type: ndcg_at_1000
250
+ value: 57.196000000000005
251
+ - type: ndcg_at_3
252
+ value: 45.616
253
+ - type: ndcg_at_5
254
+ value: 47.646
255
+ - type: precision_at_1
256
+ value: 40.629
257
+ - type: precision_at_10
258
+ value: 9.785
259
+ - type: precision_at_100
260
+ value: 1.562
261
+ - type: precision_at_1000
262
+ value: 0.2
263
+ - type: precision_at_3
264
+ value: 22.031
265
+ - type: precision_at_5
266
+ value: 15.737000000000002
267
+ - type: recall_at_1
268
+ value: 32.361000000000004
269
+ - type: recall_at_10
270
+ value: 62.214000000000006
271
+ - type: recall_at_100
272
+ value: 81.464
273
+ - type: recall_at_1000
274
+ value: 95.905
275
+ - type: recall_at_3
276
+ value: 47.5
277
+ - type: recall_at_5
278
+ value: 53.69500000000001
279
+ - task:
280
+ type: Retrieval
281
+ dataset:
282
+ type: BeIR/cqadupstack
283
+ name: MTEB CQADupstackEnglishRetrieval
284
+ config: default
285
+ split: test
286
+ revision: None
287
+ metrics:
288
+ - type: map_at_1
289
+ value: 27.971
290
+ - type: map_at_10
291
+ value: 37.444
292
+ - type: map_at_100
293
+ value: 38.607
294
+ - type: map_at_1000
295
+ value: 38.737
296
+ - type: map_at_3
297
+ value: 34.504000000000005
298
+ - type: map_at_5
299
+ value: 36.234
300
+ - type: mrr_at_1
301
+ value: 35.35
302
+ - type: mrr_at_10
303
+ value: 43.441
304
+ - type: mrr_at_100
305
+ value: 44.147999999999996
306
+ - type: mrr_at_1000
307
+ value: 44.196000000000005
308
+ - type: mrr_at_3
309
+ value: 41.285
310
+ - type: mrr_at_5
311
+ value: 42.552
312
+ - type: ndcg_at_1
313
+ value: 35.35
314
+ - type: ndcg_at_10
315
+ value: 42.903999999999996
316
+ - type: ndcg_at_100
317
+ value: 47.406
318
+ - type: ndcg_at_1000
319
+ value: 49.588
320
+ - type: ndcg_at_3
321
+ value: 38.778
322
+ - type: ndcg_at_5
323
+ value: 40.788000000000004
324
+ - type: precision_at_1
325
+ value: 35.35
326
+ - type: precision_at_10
327
+ value: 8.083
328
+ - type: precision_at_100
329
+ value: 1.313
330
+ - type: precision_at_1000
331
+ value: 0.18
332
+ - type: precision_at_3
333
+ value: 18.769
334
+ - type: precision_at_5
335
+ value: 13.439
336
+ - type: recall_at_1
337
+ value: 27.971
338
+ - type: recall_at_10
339
+ value: 52.492000000000004
340
+ - type: recall_at_100
341
+ value: 71.642
342
+ - type: recall_at_1000
343
+ value: 85.488
344
+ - type: recall_at_3
345
+ value: 40.1
346
+ - type: recall_at_5
347
+ value: 45.800000000000004
348
+ - task:
349
+ type: Retrieval
350
+ dataset:
351
+ type: BeIR/cqadupstack
352
+ name: MTEB CQADupstackGamingRetrieval
353
+ config: default
354
+ split: test
355
+ revision: None
356
+ metrics:
357
+ - type: map_at_1
358
+ value: 39.898
359
+ - type: map_at_10
360
+ value: 51.819
361
+ - type: map_at_100
362
+ value: 52.886
363
+ - type: map_at_1000
364
+ value: 52.941
365
+ - type: map_at_3
366
+ value: 48.619
367
+ - type: map_at_5
368
+ value: 50.493
369
+ - type: mrr_at_1
370
+ value: 45.391999999999996
371
+ - type: mrr_at_10
372
+ value: 55.230000000000004
373
+ - type: mrr_at_100
374
+ value: 55.887
375
+ - type: mrr_at_1000
376
+ value: 55.916
377
+ - type: mrr_at_3
378
+ value: 52.717000000000006
379
+ - type: mrr_at_5
380
+ value: 54.222
381
+ - type: ndcg_at_1
382
+ value: 45.391999999999996
383
+ - type: ndcg_at_10
384
+ value: 57.586999999999996
385
+ - type: ndcg_at_100
386
+ value: 61.745000000000005
387
+ - type: ndcg_at_1000
388
+ value: 62.83800000000001
389
+ - type: ndcg_at_3
390
+ value: 52.207
391
+ - type: ndcg_at_5
392
+ value: 54.925999999999995
393
+ - type: precision_at_1
394
+ value: 45.391999999999996
395
+ - type: precision_at_10
396
+ value: 9.21
397
+ - type: precision_at_100
398
+ value: 1.226
399
+ - type: precision_at_1000
400
+ value: 0.136
401
+ - type: precision_at_3
402
+ value: 23.177
403
+ - type: precision_at_5
404
+ value: 16.038
405
+ - type: recall_at_1
406
+ value: 39.898
407
+ - type: recall_at_10
408
+ value: 71.18900000000001
409
+ - type: recall_at_100
410
+ value: 89.082
411
+ - type: recall_at_1000
412
+ value: 96.865
413
+ - type: recall_at_3
414
+ value: 56.907
415
+ - type: recall_at_5
416
+ value: 63.397999999999996
417
+ - task:
418
+ type: Retrieval
419
+ dataset:
420
+ type: BeIR/cqadupstack
421
+ name: MTEB CQADupstackGisRetrieval
422
+ config: default
423
+ split: test
424
+ revision: None
425
+ metrics:
426
+ - type: map_at_1
427
+ value: 22.706
428
+ - type: map_at_10
429
+ value: 30.818
430
+ - type: map_at_100
431
+ value: 32.038
432
+ - type: map_at_1000
433
+ value: 32.123000000000005
434
+ - type: map_at_3
435
+ value: 28.077
436
+ - type: map_at_5
437
+ value: 29.709999999999997
438
+ - type: mrr_at_1
439
+ value: 24.407
440
+ - type: mrr_at_10
441
+ value: 32.555
442
+ - type: mrr_at_100
443
+ value: 33.692
444
+ - type: mrr_at_1000
445
+ value: 33.751
446
+ - type: mrr_at_3
447
+ value: 29.848999999999997
448
+ - type: mrr_at_5
449
+ value: 31.509999999999998
450
+ - type: ndcg_at_1
451
+ value: 24.407
452
+ - type: ndcg_at_10
453
+ value: 35.624
454
+ - type: ndcg_at_100
455
+ value: 41.454
456
+ - type: ndcg_at_1000
457
+ value: 43.556
458
+ - type: ndcg_at_3
459
+ value: 30.217
460
+ - type: ndcg_at_5
461
+ value: 33.111000000000004
462
+ - type: precision_at_1
463
+ value: 24.407
464
+ - type: precision_at_10
465
+ value: 5.548
466
+ - type: precision_at_100
467
+ value: 0.8869999999999999
468
+ - type: precision_at_1000
469
+ value: 0.11100000000000002
470
+ - type: precision_at_3
471
+ value: 12.731
472
+ - type: precision_at_5
473
+ value: 9.22
474
+ - type: recall_at_1
475
+ value: 22.706
476
+ - type: recall_at_10
477
+ value: 48.772
478
+ - type: recall_at_100
479
+ value: 75.053
480
+ - type: recall_at_1000
481
+ value: 90.731
482
+ - type: recall_at_3
483
+ value: 34.421
484
+ - type: recall_at_5
485
+ value: 41.427
486
+ - task:
487
+ type: Retrieval
488
+ dataset:
489
+ type: BeIR/cqadupstack
490
+ name: MTEB CQADupstackMathematicaRetrieval
491
+ config: default
492
+ split: test
493
+ revision: None
494
+ metrics:
495
+ - type: map_at_1
496
+ value: 13.424
497
+ - type: map_at_10
498
+ value: 21.09
499
+ - type: map_at_100
500
+ value: 22.264999999999997
501
+ - type: map_at_1000
502
+ value: 22.402
503
+ - type: map_at_3
504
+ value: 18.312
505
+ - type: map_at_5
506
+ value: 19.874
507
+ - type: mrr_at_1
508
+ value: 16.915
509
+ - type: mrr_at_10
510
+ value: 25.258000000000003
511
+ - type: mrr_at_100
512
+ value: 26.228
513
+ - type: mrr_at_1000
514
+ value: 26.31
515
+ - type: mrr_at_3
516
+ value: 22.492
517
+ - type: mrr_at_5
518
+ value: 24.04
519
+ - type: ndcg_at_1
520
+ value: 16.915
521
+ - type: ndcg_at_10
522
+ value: 26.266000000000002
523
+ - type: ndcg_at_100
524
+ value: 32.08
525
+ - type: ndcg_at_1000
526
+ value: 35.086
527
+ - type: ndcg_at_3
528
+ value: 21.049
529
+ - type: ndcg_at_5
530
+ value: 23.508000000000003
531
+ - type: precision_at_1
532
+ value: 16.915
533
+ - type: precision_at_10
534
+ value: 5.1
535
+ - type: precision_at_100
536
+ value: 0.9329999999999999
537
+ - type: precision_at_1000
538
+ value: 0.131
539
+ - type: precision_at_3
540
+ value: 10.282
541
+ - type: precision_at_5
542
+ value: 7.836
543
+ - type: recall_at_1
544
+ value: 13.424
545
+ - type: recall_at_10
546
+ value: 38.179
547
+ - type: recall_at_100
548
+ value: 63.906
549
+ - type: recall_at_1000
550
+ value: 84.933
551
+ - type: recall_at_3
552
+ value: 23.878
553
+ - type: recall_at_5
554
+ value: 30.037999999999997
555
+ - task:
556
+ type: Retrieval
557
+ dataset:
558
+ type: BeIR/cqadupstack
559
+ name: MTEB CQADupstackPhysicsRetrieval
560
+ config: default
561
+ split: test
562
+ revision: None
563
+ metrics:
564
+ - type: map_at_1
565
+ value: 26.154
566
+ - type: map_at_10
567
+ value: 35.912
568
+ - type: map_at_100
569
+ value: 37.211
570
+ - type: map_at_1000
571
+ value: 37.327
572
+ - type: map_at_3
573
+ value: 32.684999999999995
574
+ - type: map_at_5
575
+ value: 34.562
576
+ - type: mrr_at_1
577
+ value: 32.435
578
+ - type: mrr_at_10
579
+ value: 41.411
580
+ - type: mrr_at_100
581
+ value: 42.297000000000004
582
+ - type: mrr_at_1000
583
+ value: 42.345
584
+ - type: mrr_at_3
585
+ value: 38.771
586
+ - type: mrr_at_5
587
+ value: 40.33
588
+ - type: ndcg_at_1
589
+ value: 32.435
590
+ - type: ndcg_at_10
591
+ value: 41.785
592
+ - type: ndcg_at_100
593
+ value: 47.469
594
+ - type: ndcg_at_1000
595
+ value: 49.685
596
+ - type: ndcg_at_3
597
+ value: 36.618
598
+ - type: ndcg_at_5
599
+ value: 39.101
600
+ - type: precision_at_1
601
+ value: 32.435
602
+ - type: precision_at_10
603
+ value: 7.642
604
+ - type: precision_at_100
605
+ value: 1.244
606
+ - type: precision_at_1000
607
+ value: 0.163
608
+ - type: precision_at_3
609
+ value: 17.485
610
+ - type: precision_at_5
611
+ value: 12.57
612
+ - type: recall_at_1
613
+ value: 26.154
614
+ - type: recall_at_10
615
+ value: 54.111
616
+ - type: recall_at_100
617
+ value: 78.348
618
+ - type: recall_at_1000
619
+ value: 92.996
620
+ - type: recall_at_3
621
+ value: 39.189
622
+ - type: recall_at_5
623
+ value: 45.852
624
+ - task:
625
+ type: Retrieval
626
+ dataset:
627
+ type: BeIR/cqadupstack
628
+ name: MTEB CQADupstackProgrammersRetrieval
629
+ config: default
630
+ split: test
631
+ revision: None
632
+ metrics:
633
+ - type: map_at_1
634
+ value: 26.308999999999997
635
+ - type: map_at_10
636
+ value: 35.524
637
+ - type: map_at_100
638
+ value: 36.774
639
+ - type: map_at_1000
640
+ value: 36.891
641
+ - type: map_at_3
642
+ value: 32.561
643
+ - type: map_at_5
644
+ value: 34.034
645
+ - type: mrr_at_1
646
+ value: 31.735000000000003
647
+ - type: mrr_at_10
648
+ value: 40.391
649
+ - type: mrr_at_100
650
+ value: 41.227000000000004
651
+ - type: mrr_at_1000
652
+ value: 41.288000000000004
653
+ - type: mrr_at_3
654
+ value: 37.938
655
+ - type: mrr_at_5
656
+ value: 39.193
657
+ - type: ndcg_at_1
658
+ value: 31.735000000000003
659
+ - type: ndcg_at_10
660
+ value: 41.166000000000004
661
+ - type: ndcg_at_100
662
+ value: 46.702
663
+ - type: ndcg_at_1000
664
+ value: 49.157000000000004
665
+ - type: ndcg_at_3
666
+ value: 36.274
667
+ - type: ndcg_at_5
668
+ value: 38.177
669
+ - type: precision_at_1
670
+ value: 31.735000000000003
671
+ - type: precision_at_10
672
+ value: 7.5569999999999995
673
+ - type: precision_at_100
674
+ value: 1.2109999999999999
675
+ - type: precision_at_1000
676
+ value: 0.16
677
+ - type: precision_at_3
678
+ value: 17.199
679
+ - type: precision_at_5
680
+ value: 12.123000000000001
681
+ - type: recall_at_1
682
+ value: 26.308999999999997
683
+ - type: recall_at_10
684
+ value: 53.083000000000006
685
+ - type: recall_at_100
686
+ value: 76.922
687
+ - type: recall_at_1000
688
+ value: 93.767
689
+ - type: recall_at_3
690
+ value: 39.262
691
+ - type: recall_at_5
692
+ value: 44.413000000000004
693
+ - task:
694
+ type: Retrieval
695
+ dataset:
696
+ type: BeIR/cqadupstack
697
+ name: MTEB CQADupstackRetrieval
698
+ config: default
699
+ split: test
700
+ revision: None
701
+ metrics:
702
+ - type: map_at_1
703
+ value: 24.391250000000003
704
+ - type: map_at_10
705
+ value: 33.280166666666666
706
+ - type: map_at_100
707
+ value: 34.49566666666667
708
+ - type: map_at_1000
709
+ value: 34.61533333333333
710
+ - type: map_at_3
711
+ value: 30.52183333333333
712
+ - type: map_at_5
713
+ value: 32.06608333333333
714
+ - type: mrr_at_1
715
+ value: 29.105083333333337
716
+ - type: mrr_at_10
717
+ value: 37.44766666666666
718
+ - type: mrr_at_100
719
+ value: 38.32491666666667
720
+ - type: mrr_at_1000
721
+ value: 38.385666666666665
722
+ - type: mrr_at_3
723
+ value: 35.06883333333333
724
+ - type: mrr_at_5
725
+ value: 36.42066666666667
726
+ - type: ndcg_at_1
727
+ value: 29.105083333333337
728
+ - type: ndcg_at_10
729
+ value: 38.54358333333333
730
+ - type: ndcg_at_100
731
+ value: 43.833583333333344
732
+ - type: ndcg_at_1000
733
+ value: 46.215333333333334
734
+ - type: ndcg_at_3
735
+ value: 33.876
736
+ - type: ndcg_at_5
737
+ value: 36.05208333333333
738
+ - type: precision_at_1
739
+ value: 29.105083333333337
740
+ - type: precision_at_10
741
+ value: 6.823416666666665
742
+ - type: precision_at_100
743
+ value: 1.1270833333333334
744
+ - type: precision_at_1000
745
+ value: 0.15208333333333332
746
+ - type: precision_at_3
747
+ value: 15.696750000000002
748
+ - type: precision_at_5
749
+ value: 11.193499999999998
750
+ - type: recall_at_1
751
+ value: 24.391250000000003
752
+ - type: recall_at_10
753
+ value: 49.98808333333333
754
+ - type: recall_at_100
755
+ value: 73.31616666666666
756
+ - type: recall_at_1000
757
+ value: 89.96291666666667
758
+ - type: recall_at_3
759
+ value: 36.86666666666667
760
+ - type: recall_at_5
761
+ value: 42.54350000000001
762
+ - task:
763
+ type: Retrieval
764
+ dataset:
765
+ type: BeIR/cqadupstack
766
+ name: MTEB CQADupstackStatsRetrieval
767
+ config: default
768
+ split: test
769
+ revision: None
770
+ metrics:
771
+ - type: map_at_1
772
+ value: 21.995
773
+ - type: map_at_10
774
+ value: 28.807
775
+ - type: map_at_100
776
+ value: 29.813000000000002
777
+ - type: map_at_1000
778
+ value: 29.903000000000002
779
+ - type: map_at_3
780
+ value: 26.636
781
+ - type: map_at_5
782
+ value: 27.912
783
+ - type: mrr_at_1
784
+ value: 24.847
785
+ - type: mrr_at_10
786
+ value: 31.494
787
+ - type: mrr_at_100
788
+ value: 32.381
789
+ - type: mrr_at_1000
790
+ value: 32.446999999999996
791
+ - type: mrr_at_3
792
+ value: 29.473
793
+ - type: mrr_at_5
794
+ value: 30.7
795
+ - type: ndcg_at_1
796
+ value: 24.847
797
+ - type: ndcg_at_10
798
+ value: 32.818999999999996
799
+ - type: ndcg_at_100
800
+ value: 37.835
801
+ - type: ndcg_at_1000
802
+ value: 40.226
803
+ - type: ndcg_at_3
804
+ value: 28.811999999999998
805
+ - type: ndcg_at_5
806
+ value: 30.875999999999998
807
+ - type: precision_at_1
808
+ value: 24.847
809
+ - type: precision_at_10
810
+ value: 5.244999999999999
811
+ - type: precision_at_100
812
+ value: 0.856
813
+ - type: precision_at_1000
814
+ value: 0.11299999999999999
815
+ - type: precision_at_3
816
+ value: 12.577
817
+ - type: precision_at_5
818
+ value: 8.895999999999999
819
+ - type: recall_at_1
820
+ value: 21.995
821
+ - type: recall_at_10
822
+ value: 42.479
823
+ - type: recall_at_100
824
+ value: 65.337
825
+ - type: recall_at_1000
826
+ value: 83.23700000000001
827
+ - type: recall_at_3
828
+ value: 31.573
829
+ - type: recall_at_5
830
+ value: 36.684
831
+ - task:
832
+ type: Retrieval
833
+ dataset:
834
+ type: BeIR/cqadupstack
835
+ name: MTEB CQADupstackTexRetrieval
836
+ config: default
837
+ split: test
838
+ revision: None
839
+ metrics:
840
+ - type: map_at_1
841
+ value: 15.751000000000001
842
+ - type: map_at_10
843
+ value: 21.909
844
+ - type: map_at_100
845
+ value: 23.064
846
+ - type: map_at_1000
847
+ value: 23.205000000000002
848
+ - type: map_at_3
849
+ value: 20.138
850
+ - type: map_at_5
851
+ value: 20.973
852
+ - type: mrr_at_1
853
+ value: 19.305
854
+ - type: mrr_at_10
855
+ value: 25.647
856
+ - type: mrr_at_100
857
+ value: 26.659
858
+ - type: mrr_at_1000
859
+ value: 26.748
860
+ - type: mrr_at_3
861
+ value: 23.933
862
+ - type: mrr_at_5
863
+ value: 24.754
864
+ - type: ndcg_at_1
865
+ value: 19.305
866
+ - type: ndcg_at_10
867
+ value: 25.886
868
+ - type: ndcg_at_100
869
+ value: 31.56
870
+ - type: ndcg_at_1000
871
+ value: 34.799
872
+ - type: ndcg_at_3
873
+ value: 22.708000000000002
874
+ - type: ndcg_at_5
875
+ value: 23.838
876
+ - type: precision_at_1
877
+ value: 19.305
878
+ - type: precision_at_10
879
+ value: 4.677
880
+ - type: precision_at_100
881
+ value: 0.895
882
+ - type: precision_at_1000
883
+ value: 0.136
884
+ - type: precision_at_3
885
+ value: 10.771
886
+ - type: precision_at_5
887
+ value: 7.46
888
+ - type: recall_at_1
889
+ value: 15.751000000000001
890
+ - type: recall_at_10
891
+ value: 34.156
892
+ - type: recall_at_100
893
+ value: 59.899
894
+ - type: recall_at_1000
895
+ value: 83.08
896
+ - type: recall_at_3
897
+ value: 24.772
898
+ - type: recall_at_5
899
+ value: 28.009
900
+ - task:
901
+ type: Retrieval
902
+ dataset:
903
+ type: BeIR/cqadupstack
904
+ name: MTEB CQADupstackUnixRetrieval
905
+ config: default
906
+ split: test
907
+ revision: None
908
+ metrics:
909
+ - type: map_at_1
910
+ value: 23.34
911
+ - type: map_at_10
912
+ value: 32.383
913
+ - type: map_at_100
914
+ value: 33.629999999999995
915
+ - type: map_at_1000
916
+ value: 33.735
917
+ - type: map_at_3
918
+ value: 29.68
919
+ - type: map_at_5
920
+ value: 31.270999999999997
921
+ - type: mrr_at_1
922
+ value: 27.612
923
+ - type: mrr_at_10
924
+ value: 36.381
925
+ - type: mrr_at_100
926
+ value: 37.351
927
+ - type: mrr_at_1000
928
+ value: 37.411
929
+ - type: mrr_at_3
930
+ value: 33.893
931
+ - type: mrr_at_5
932
+ value: 35.353
933
+ - type: ndcg_at_1
934
+ value: 27.612
935
+ - type: ndcg_at_10
936
+ value: 37.714999999999996
937
+ - type: ndcg_at_100
938
+ value: 43.525000000000006
939
+ - type: ndcg_at_1000
940
+ value: 45.812999999999995
941
+ - type: ndcg_at_3
942
+ value: 32.796
943
+ - type: ndcg_at_5
944
+ value: 35.243
945
+ - type: precision_at_1
946
+ value: 27.612
947
+ - type: precision_at_10
948
+ value: 6.465
949
+ - type: precision_at_100
950
+ value: 1.0619999999999998
951
+ - type: precision_at_1000
952
+ value: 0.13699999999999998
953
+ - type: precision_at_3
954
+ value: 15.049999999999999
955
+ - type: precision_at_5
956
+ value: 10.764999999999999
957
+ - type: recall_at_1
958
+ value: 23.34
959
+ - type: recall_at_10
960
+ value: 49.856
961
+ - type: recall_at_100
962
+ value: 75.334
963
+ - type: recall_at_1000
964
+ value: 91.156
965
+ - type: recall_at_3
966
+ value: 36.497
967
+ - type: recall_at_5
968
+ value: 42.769
969
+ - task:
970
+ type: Retrieval
971
+ dataset:
972
+ type: BeIR/cqadupstack
973
+ name: MTEB CQADupstackWebmastersRetrieval
974
+ config: default
975
+ split: test
976
+ revision: None
977
+ metrics:
978
+ - type: map_at_1
979
+ value: 25.097
980
+ - type: map_at_10
981
+ value: 34.599999999999994
982
+ - type: map_at_100
983
+ value: 36.174
984
+ - type: map_at_1000
985
+ value: 36.398
986
+ - type: map_at_3
987
+ value: 31.781
988
+ - type: map_at_5
989
+ value: 33.22
990
+ - type: mrr_at_1
991
+ value: 31.225
992
+ - type: mrr_at_10
993
+ value: 39.873
994
+ - type: mrr_at_100
995
+ value: 40.853
996
+ - type: mrr_at_1000
997
+ value: 40.904
998
+ - type: mrr_at_3
999
+ value: 37.681
1000
+ - type: mrr_at_5
1001
+ value: 38.669
1002
+ - type: ndcg_at_1
1003
+ value: 31.225
1004
+ - type: ndcg_at_10
1005
+ value: 40.586
1006
+ - type: ndcg_at_100
1007
+ value: 46.226
1008
+ - type: ndcg_at_1000
1009
+ value: 48.788
1010
+ - type: ndcg_at_3
1011
+ value: 36.258
1012
+ - type: ndcg_at_5
1013
+ value: 37.848
1014
+ - type: precision_at_1
1015
+ value: 31.225
1016
+ - type: precision_at_10
1017
+ value: 7.707999999999999
1018
+ - type: precision_at_100
1019
+ value: 1.536
1020
+ - type: precision_at_1000
1021
+ value: 0.242
1022
+ - type: precision_at_3
1023
+ value: 17.26
1024
+ - type: precision_at_5
1025
+ value: 12.253
1026
+ - type: recall_at_1
1027
+ value: 25.097
1028
+ - type: recall_at_10
1029
+ value: 51.602000000000004
1030
+ - type: recall_at_100
1031
+ value: 76.854
1032
+ - type: recall_at_1000
1033
+ value: 93.303
1034
+ - type: recall_at_3
1035
+ value: 38.68
1036
+ - type: recall_at_5
1037
+ value: 43.258
1038
+ - task:
1039
+ type: Retrieval
1040
+ dataset:
1041
+ type: BeIR/cqadupstack
1042
+ name: MTEB CQADupstackWordpressRetrieval
1043
+ config: default
1044
+ split: test
1045
+ revision: None
1046
+ metrics:
1047
+ - type: map_at_1
1048
+ value: 17.689
1049
+ - type: map_at_10
1050
+ value: 25.291000000000004
1051
+ - type: map_at_100
1052
+ value: 26.262
1053
+ - type: map_at_1000
1054
+ value: 26.372
1055
+ - type: map_at_3
1056
+ value: 22.916
1057
+ - type: map_at_5
1058
+ value: 24.315
1059
+ - type: mrr_at_1
1060
+ value: 19.409000000000002
1061
+ - type: mrr_at_10
1062
+ value: 27.233
1063
+ - type: mrr_at_100
1064
+ value: 28.109
1065
+ - type: mrr_at_1000
1066
+ value: 28.192
1067
+ - type: mrr_at_3
1068
+ value: 24.892
1069
+ - type: mrr_at_5
1070
+ value: 26.278000000000002
1071
+ - type: ndcg_at_1
1072
+ value: 19.409000000000002
1073
+ - type: ndcg_at_10
1074
+ value: 29.809
1075
+ - type: ndcg_at_100
1076
+ value: 34.936
1077
+ - type: ndcg_at_1000
1078
+ value: 37.852000000000004
1079
+ - type: ndcg_at_3
1080
+ value: 25.179000000000002
1081
+ - type: ndcg_at_5
1082
+ value: 27.563
1083
+ - type: precision_at_1
1084
+ value: 19.409000000000002
1085
+ - type: precision_at_10
1086
+ value: 4.861
1087
+ - type: precision_at_100
1088
+ value: 0.8
1089
+ - type: precision_at_1000
1090
+ value: 0.116
1091
+ - type: precision_at_3
1092
+ value: 11.029
1093
+ - type: precision_at_5
1094
+ value: 7.985
1095
+ - type: recall_at_1
1096
+ value: 17.689
1097
+ - type: recall_at_10
1098
+ value: 41.724
1099
+ - type: recall_at_100
1100
+ value: 65.95299999999999
1101
+ - type: recall_at_1000
1102
+ value: 88.094
1103
+ - type: recall_at_3
1104
+ value: 29.621
1105
+ - type: recall_at_5
1106
+ value: 35.179
1107
+ - task:
1108
+ type: Retrieval
1109
+ dataset:
1110
+ type: climate-fever
1111
+ name: MTEB ClimateFEVER
1112
+ config: default
1113
+ split: test
1114
+ revision: None
1115
+ metrics:
1116
+ - type: map_at_1
1117
+ value: 10.581
1118
+ - type: map_at_10
1119
+ value: 18.944
1120
+ - type: map_at_100
1121
+ value: 20.812
1122
+ - type: map_at_1000
1123
+ value: 21.002000000000002
1124
+ - type: map_at_3
1125
+ value: 15.661
1126
+ - type: map_at_5
1127
+ value: 17.502000000000002
1128
+ - type: mrr_at_1
1129
+ value: 23.388
1130
+ - type: mrr_at_10
1131
+ value: 34.263
1132
+ - type: mrr_at_100
1133
+ value: 35.364000000000004
1134
+ - type: mrr_at_1000
1135
+ value: 35.409
1136
+ - type: mrr_at_3
1137
+ value: 30.586000000000002
1138
+ - type: mrr_at_5
1139
+ value: 32.928000000000004
1140
+ - type: ndcg_at_1
1141
+ value: 23.388
1142
+ - type: ndcg_at_10
1143
+ value: 26.56
1144
+ - type: ndcg_at_100
1145
+ value: 34.248
1146
+ - type: ndcg_at_1000
1147
+ value: 37.779
1148
+ - type: ndcg_at_3
1149
+ value: 21.179000000000002
1150
+ - type: ndcg_at_5
1151
+ value: 23.504
1152
+ - type: precision_at_1
1153
+ value: 23.388
1154
+ - type: precision_at_10
1155
+ value: 8.476
1156
+ - type: precision_at_100
1157
+ value: 1.672
1158
+ - type: precision_at_1000
1159
+ value: 0.233
1160
+ - type: precision_at_3
1161
+ value: 15.852
1162
+ - type: precision_at_5
1163
+ value: 12.73
1164
+ - type: recall_at_1
1165
+ value: 10.581
1166
+ - type: recall_at_10
1167
+ value: 32.512
1168
+ - type: recall_at_100
1169
+ value: 59.313
1170
+ - type: recall_at_1000
1171
+ value: 79.25
1172
+ - type: recall_at_3
1173
+ value: 19.912
1174
+ - type: recall_at_5
1175
+ value: 25.832
1176
+ - task:
1177
+ type: Retrieval
1178
+ dataset:
1179
+ type: dbpedia-entity
1180
+ name: MTEB DBPedia
1181
+ config: default
1182
+ split: test
1183
+ revision: None
1184
+ metrics:
1185
+ - type: map_at_1
1186
+ value: 9.35
1187
+ - type: map_at_10
1188
+ value: 20.134
1189
+ - type: map_at_100
1190
+ value: 28.975
1191
+ - type: map_at_1000
1192
+ value: 30.709999999999997
1193
+ - type: map_at_3
1194
+ value: 14.513000000000002
1195
+ - type: map_at_5
1196
+ value: 16.671
1197
+ - type: mrr_at_1
1198
+ value: 69.75
1199
+ - type: mrr_at_10
1200
+ value: 77.67699999999999
1201
+ - type: mrr_at_100
1202
+ value: 77.97500000000001
1203
+ - type: mrr_at_1000
1204
+ value: 77.985
1205
+ - type: mrr_at_3
1206
+ value: 76.292
1207
+ - type: mrr_at_5
1208
+ value: 77.179
1209
+ - type: ndcg_at_1
1210
+ value: 56.49999999999999
1211
+ - type: ndcg_at_10
1212
+ value: 42.226
1213
+ - type: ndcg_at_100
1214
+ value: 47.562
1215
+ - type: ndcg_at_1000
1216
+ value: 54.923
1217
+ - type: ndcg_at_3
1218
+ value: 46.564
1219
+ - type: ndcg_at_5
1220
+ value: 43.830000000000005
1221
+ - type: precision_at_1
1222
+ value: 69.75
1223
+ - type: precision_at_10
1224
+ value: 33.525
1225
+ - type: precision_at_100
1226
+ value: 11.035
1227
+ - type: precision_at_1000
1228
+ value: 2.206
1229
+ - type: precision_at_3
1230
+ value: 49.75
1231
+ - type: precision_at_5
1232
+ value: 42
1233
+ - type: recall_at_1
1234
+ value: 9.35
1235
+ - type: recall_at_10
1236
+ value: 25.793
1237
+ - type: recall_at_100
1238
+ value: 54.186
1239
+ - type: recall_at_1000
1240
+ value: 77.81
1241
+ - type: recall_at_3
1242
+ value: 15.770000000000001
1243
+ - type: recall_at_5
1244
+ value: 19.09
1245
+ - task:
1246
+ type: Classification
1247
+ dataset:
1248
+ type: mteb/emotion
1249
+ name: MTEB EmotionClassification
1250
+ config: default
1251
+ split: test
1252
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
1253
+ metrics:
1254
+ - type: accuracy
1255
+ value: 46.945
1256
+ - type: f1
1257
+ value: 42.07407842992542
1258
+ - task:
1259
+ type: Retrieval
1260
+ dataset:
1261
+ type: fever
1262
+ name: MTEB FEVER
1263
+ config: default
1264
+ split: test
1265
+ revision: None
1266
+ metrics:
1267
+ - type: map_at_1
1268
+ value: 71.04599999999999
1269
+ - type: map_at_10
1270
+ value: 80.718
1271
+ - type: map_at_100
1272
+ value: 80.961
1273
+ - type: map_at_1000
1274
+ value: 80.974
1275
+ - type: map_at_3
1276
+ value: 79.49199999999999
1277
+ - type: map_at_5
1278
+ value: 80.32000000000001
1279
+ - type: mrr_at_1
1280
+ value: 76.388
1281
+ - type: mrr_at_10
1282
+ value: 85.214
1283
+ - type: mrr_at_100
1284
+ value: 85.302
1285
+ - type: mrr_at_1000
1286
+ value: 85.302
1287
+ - type: mrr_at_3
1288
+ value: 84.373
1289
+ - type: mrr_at_5
1290
+ value: 84.979
1291
+ - type: ndcg_at_1
1292
+ value: 76.388
1293
+ - type: ndcg_at_10
1294
+ value: 84.987
1295
+ - type: ndcg_at_100
1296
+ value: 85.835
1297
+ - type: ndcg_at_1000
1298
+ value: 86.04899999999999
1299
+ - type: ndcg_at_3
1300
+ value: 83.04
1301
+ - type: ndcg_at_5
1302
+ value: 84.22500000000001
1303
+ - type: precision_at_1
1304
+ value: 76.388
1305
+ - type: precision_at_10
1306
+ value: 10.35
1307
+ - type: precision_at_100
1308
+ value: 1.099
1309
+ - type: precision_at_1000
1310
+ value: 0.11399999999999999
1311
+ - type: precision_at_3
1312
+ value: 32.108
1313
+ - type: precision_at_5
1314
+ value: 20.033
1315
+ - type: recall_at_1
1316
+ value: 71.04599999999999
1317
+ - type: recall_at_10
1318
+ value: 93.547
1319
+ - type: recall_at_100
1320
+ value: 96.887
1321
+ - type: recall_at_1000
1322
+ value: 98.158
1323
+ - type: recall_at_3
1324
+ value: 88.346
1325
+ - type: recall_at_5
1326
+ value: 91.321
1327
+ - task:
1328
+ type: Retrieval
1329
+ dataset:
1330
+ type: fiqa
1331
+ name: MTEB FiQA2018
1332
+ config: default
1333
+ split: test
1334
+ revision: None
1335
+ metrics:
1336
+ - type: map_at_1
1337
+ value: 19.8
1338
+ - type: map_at_10
1339
+ value: 31.979999999999997
1340
+ - type: map_at_100
1341
+ value: 33.876
1342
+ - type: map_at_1000
1343
+ value: 34.056999999999995
1344
+ - type: map_at_3
1345
+ value: 28.067999999999998
1346
+ - type: map_at_5
1347
+ value: 30.066
1348
+ - type: mrr_at_1
1349
+ value: 38.735
1350
+ - type: mrr_at_10
1351
+ value: 47.749
1352
+ - type: mrr_at_100
1353
+ value: 48.605
1354
+ - type: mrr_at_1000
1355
+ value: 48.644999999999996
1356
+ - type: mrr_at_3
1357
+ value: 45.165
1358
+ - type: mrr_at_5
1359
+ value: 46.646
1360
+ - type: ndcg_at_1
1361
+ value: 38.735
1362
+ - type: ndcg_at_10
1363
+ value: 39.883
1364
+ - type: ndcg_at_100
1365
+ value: 46.983000000000004
1366
+ - type: ndcg_at_1000
1367
+ value: 50.043000000000006
1368
+ - type: ndcg_at_3
1369
+ value: 35.943000000000005
1370
+ - type: ndcg_at_5
1371
+ value: 37.119
1372
+ - type: precision_at_1
1373
+ value: 38.735
1374
+ - type: precision_at_10
1375
+ value: 10.940999999999999
1376
+ - type: precision_at_100
1377
+ value: 1.836
1378
+ - type: precision_at_1000
1379
+ value: 0.23900000000000002
1380
+ - type: precision_at_3
1381
+ value: 23.817
1382
+ - type: precision_at_5
1383
+ value: 17.346
1384
+ - type: recall_at_1
1385
+ value: 19.8
1386
+ - type: recall_at_10
1387
+ value: 47.082
1388
+ - type: recall_at_100
1389
+ value: 73.247
1390
+ - type: recall_at_1000
1391
+ value: 91.633
1392
+ - type: recall_at_3
1393
+ value: 33.201
1394
+ - type: recall_at_5
1395
+ value: 38.81
1396
+ - task:
1397
+ type: Retrieval
1398
+ dataset:
1399
+ type: hotpotqa
1400
+ name: MTEB HotpotQA
1401
+ config: default
1402
+ split: test
1403
+ revision: None
1404
+ metrics:
1405
+ - type: map_at_1
1406
+ value: 38.102999999999994
1407
+ - type: map_at_10
1408
+ value: 60.547
1409
+ - type: map_at_100
1410
+ value: 61.466
1411
+ - type: map_at_1000
1412
+ value: 61.526
1413
+ - type: map_at_3
1414
+ value: 56.973
1415
+ - type: map_at_5
1416
+ value: 59.244
1417
+ - type: mrr_at_1
1418
+ value: 76.205
1419
+ - type: mrr_at_10
1420
+ value: 82.816
1421
+ - type: mrr_at_100
1422
+ value: 83.002
1423
+ - type: mrr_at_1000
1424
+ value: 83.009
1425
+ - type: mrr_at_3
1426
+ value: 81.747
1427
+ - type: mrr_at_5
1428
+ value: 82.467
1429
+ - type: ndcg_at_1
1430
+ value: 76.205
1431
+ - type: ndcg_at_10
1432
+ value: 69.15
1433
+ - type: ndcg_at_100
1434
+ value: 72.297
1435
+ - type: ndcg_at_1000
1436
+ value: 73.443
1437
+ - type: ndcg_at_3
1438
+ value: 64.07000000000001
1439
+ - type: ndcg_at_5
1440
+ value: 66.96600000000001
1441
+ - type: precision_at_1
1442
+ value: 76.205
1443
+ - type: precision_at_10
1444
+ value: 14.601
1445
+ - type: precision_at_100
1446
+ value: 1.7049999999999998
1447
+ - type: precision_at_1000
1448
+ value: 0.186
1449
+ - type: precision_at_3
1450
+ value: 41.202
1451
+ - type: precision_at_5
1452
+ value: 27.006000000000004
1453
+ - type: recall_at_1
1454
+ value: 38.102999999999994
1455
+ - type: recall_at_10
1456
+ value: 73.005
1457
+ - type: recall_at_100
1458
+ value: 85.253
1459
+ - type: recall_at_1000
1460
+ value: 92.795
1461
+ - type: recall_at_3
1462
+ value: 61.803
1463
+ - type: recall_at_5
1464
+ value: 67.515
1465
+ - task:
1466
+ type: Classification
1467
+ dataset:
1468
+ type: mteb/imdb
1469
+ name: MTEB ImdbClassification
1470
+ config: default
1471
+ split: test
1472
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
1473
+ metrics:
1474
+ - type: accuracy
1475
+ value: 86.15
1476
+ - type: ap
1477
+ value: 80.36282825265391
1478
+ - type: f1
1479
+ value: 86.07368510726472
1480
+ - task:
1481
+ type: Retrieval
1482
+ dataset:
1483
+ type: msmarco
1484
+ name: MTEB MSMARCO
1485
+ config: default
1486
+ split: dev
1487
+ revision: None
1488
+ metrics:
1489
+ - type: map_at_1
1490
+ value: 22.6
1491
+ - type: map_at_10
1492
+ value: 34.887
1493
+ - type: map_at_100
1494
+ value: 36.069
1495
+ - type: map_at_1000
1496
+ value: 36.115
1497
+ - type: map_at_3
1498
+ value: 31.067
1499
+ - type: map_at_5
1500
+ value: 33.300000000000004
1501
+ - type: mrr_at_1
1502
+ value: 23.238
1503
+ - type: mrr_at_10
1504
+ value: 35.47
1505
+ - type: mrr_at_100
1506
+ value: 36.599
1507
+ - type: mrr_at_1000
1508
+ value: 36.64
1509
+ - type: mrr_at_3
1510
+ value: 31.735999999999997
1511
+ - type: mrr_at_5
1512
+ value: 33.939
1513
+ - type: ndcg_at_1
1514
+ value: 23.252
1515
+ - type: ndcg_at_10
1516
+ value: 41.765
1517
+ - type: ndcg_at_100
1518
+ value: 47.402
1519
+ - type: ndcg_at_1000
1520
+ value: 48.562
1521
+ - type: ndcg_at_3
1522
+ value: 34.016999999999996
1523
+ - type: ndcg_at_5
1524
+ value: 38.016
1525
+ - type: precision_at_1
1526
+ value: 23.252
1527
+ - type: precision_at_10
1528
+ value: 6.569
1529
+ - type: precision_at_100
1530
+ value: 0.938
1531
+ - type: precision_at_1000
1532
+ value: 0.104
1533
+ - type: precision_at_3
1534
+ value: 14.479000000000001
1535
+ - type: precision_at_5
1536
+ value: 10.722
1537
+ - type: recall_at_1
1538
+ value: 22.6
1539
+ - type: recall_at_10
1540
+ value: 62.919000000000004
1541
+ - type: recall_at_100
1542
+ value: 88.82
1543
+ - type: recall_at_1000
1544
+ value: 97.71600000000001
1545
+ - type: recall_at_3
1546
+ value: 41.896
1547
+ - type: recall_at_5
1548
+ value: 51.537
1549
+ - task:
1550
+ type: Classification
1551
+ dataset:
1552
+ type: mteb/mtop_domain
1553
+ name: MTEB MTOPDomainClassification (en)
1554
+ config: en
1555
+ split: test
1556
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
1557
+ metrics:
1558
+ - type: accuracy
1559
+ value: 93.69357045143639
1560
+ - type: f1
1561
+ value: 93.55489858177597
1562
+ - task:
1563
+ type: Classification
1564
+ dataset:
1565
+ type: mteb/mtop_intent
1566
+ name: MTEB MTOPIntentClassification (en)
1567
+ config: en
1568
+ split: test
1569
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
1570
+ metrics:
1571
+ - type: accuracy
1572
+ value: 75.31235750114
1573
+ - type: f1
1574
+ value: 57.891491963121155
1575
+ - task:
1576
+ type: Classification
1577
+ dataset:
1578
+ type: mteb/amazon_massive_intent
1579
+ name: MTEB MassiveIntentClassification (en)
1580
+ config: en
1581
+ split: test
1582
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
1583
+ metrics:
1584
+ - type: accuracy
1585
+ value: 73.04303967720243
1586
+ - type: f1
1587
+ value: 70.51516022297616
1588
+ - task:
1589
+ type: Classification
1590
+ dataset:
1591
+ type: mteb/amazon_massive_scenario
1592
+ name: MTEB MassiveScenarioClassification (en)
1593
+ config: en
1594
+ split: test
1595
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
1596
+ metrics:
1597
+ - type: accuracy
1598
+ value: 77.65299260255549
1599
+ - type: f1
1600
+ value: 77.49059766538576
1601
+ - task:
1602
+ type: Clustering
1603
+ dataset:
1604
+ type: mteb/medrxiv-clustering-p2p
1605
+ name: MTEB MedrxivClusteringP2P
1606
+ config: default
1607
+ split: test
1608
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
1609
+ metrics:
1610
+ - type: v_measure
1611
+ value: 31.458906115906597
1612
+ - task:
1613
+ type: Clustering
1614
+ dataset:
1615
+ type: mteb/medrxiv-clustering-s2s
1616
+ name: MTEB MedrxivClusteringS2S
1617
+ config: default
1618
+ split: test
1619
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
1620
+ metrics:
1621
+ - type: v_measure
1622
+ value: 28.9851513122443
1623
+ - task:
1624
+ type: Reranking
1625
+ dataset:
1626
+ type: mteb/mind_small
1627
+ name: MTEB MindSmallReranking
1628
+ config: default
1629
+ split: test
1630
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
1631
+ metrics:
1632
+ - type: map
1633
+ value: 31.2916268497217
1634
+ - type: mrr
1635
+ value: 32.328276715593816
1636
+ - task:
1637
+ type: Retrieval
1638
+ dataset:
1639
+ type: nfcorpus
1640
+ name: MTEB NFCorpus
1641
+ config: default
1642
+ split: test
1643
+ revision: None
1644
+ metrics:
1645
+ - type: map_at_1
1646
+ value: 6.3740000000000006
1647
+ - type: map_at_10
1648
+ value: 13.089999999999998
1649
+ - type: map_at_100
1650
+ value: 16.512
1651
+ - type: map_at_1000
1652
+ value: 18.014
1653
+ - type: map_at_3
1654
+ value: 9.671000000000001
1655
+ - type: map_at_5
1656
+ value: 11.199
1657
+ - type: mrr_at_1
1658
+ value: 46.749
1659
+ - type: mrr_at_10
1660
+ value: 55.367
1661
+ - type: mrr_at_100
1662
+ value: 56.021
1663
+ - type: mrr_at_1000
1664
+ value: 56.058
1665
+ - type: mrr_at_3
1666
+ value: 53.30200000000001
1667
+ - type: mrr_at_5
1668
+ value: 54.773
1669
+ - type: ndcg_at_1
1670
+ value: 45.046
1671
+ - type: ndcg_at_10
1672
+ value: 35.388999999999996
1673
+ - type: ndcg_at_100
1674
+ value: 32.175
1675
+ - type: ndcg_at_1000
1676
+ value: 41.018
1677
+ - type: ndcg_at_3
1678
+ value: 40.244
1679
+ - type: ndcg_at_5
1680
+ value: 38.267
1681
+ - type: precision_at_1
1682
+ value: 46.749
1683
+ - type: precision_at_10
1684
+ value: 26.563
1685
+ - type: precision_at_100
1686
+ value: 8.074
1687
+ - type: precision_at_1000
1688
+ value: 2.099
1689
+ - type: precision_at_3
1690
+ value: 37.358000000000004
1691
+ - type: precision_at_5
1692
+ value: 33.003
1693
+ - type: recall_at_1
1694
+ value: 6.3740000000000006
1695
+ - type: recall_at_10
1696
+ value: 16.805999999999997
1697
+ - type: recall_at_100
1698
+ value: 31.871
1699
+ - type: recall_at_1000
1700
+ value: 64.098
1701
+ - type: recall_at_3
1702
+ value: 10.383000000000001
1703
+ - type: recall_at_5
1704
+ value: 13.166
1705
+ - task:
1706
+ type: Retrieval
1707
+ dataset:
1708
+ type: nq
1709
+ name: MTEB NQ
1710
+ config: default
1711
+ split: test
1712
+ revision: None
1713
+ metrics:
1714
+ - type: map_at_1
1715
+ value: 34.847
1716
+ - type: map_at_10
1717
+ value: 50.532
1718
+ - type: map_at_100
1719
+ value: 51.504000000000005
1720
+ - type: map_at_1000
1721
+ value: 51.528
1722
+ - type: map_at_3
1723
+ value: 46.219
1724
+ - type: map_at_5
1725
+ value: 48.868
1726
+ - type: mrr_at_1
1727
+ value: 39.137
1728
+ - type: mrr_at_10
1729
+ value: 53.157
1730
+ - type: mrr_at_100
1731
+ value: 53.839999999999996
1732
+ - type: mrr_at_1000
1733
+ value: 53.857
1734
+ - type: mrr_at_3
1735
+ value: 49.667
1736
+ - type: mrr_at_5
1737
+ value: 51.847
1738
+ - type: ndcg_at_1
1739
+ value: 39.108
1740
+ - type: ndcg_at_10
1741
+ value: 58.221000000000004
1742
+ - type: ndcg_at_100
1743
+ value: 62.021
1744
+ - type: ndcg_at_1000
1745
+ value: 62.57
1746
+ - type: ndcg_at_3
1747
+ value: 50.27199999999999
1748
+ - type: ndcg_at_5
1749
+ value: 54.623999999999995
1750
+ - type: precision_at_1
1751
+ value: 39.108
1752
+ - type: precision_at_10
1753
+ value: 9.397
1754
+ - type: precision_at_100
1755
+ value: 1.1520000000000001
1756
+ - type: precision_at_1000
1757
+ value: 0.12
1758
+ - type: precision_at_3
1759
+ value: 22.644000000000002
1760
+ - type: precision_at_5
1761
+ value: 16.141
1762
+ - type: recall_at_1
1763
+ value: 34.847
1764
+ - type: recall_at_10
1765
+ value: 78.945
1766
+ - type: recall_at_100
1767
+ value: 94.793
1768
+ - type: recall_at_1000
1769
+ value: 98.904
1770
+ - type: recall_at_3
1771
+ value: 58.56
1772
+ - type: recall_at_5
1773
+ value: 68.535
1774
+ - task:
1775
+ type: Retrieval
1776
+ dataset:
1777
+ type: quora
1778
+ name: MTEB QuoraRetrieval
1779
+ config: default
1780
+ split: test
1781
+ revision: None
1782
+ metrics:
1783
+ - type: map_at_1
1784
+ value: 68.728
1785
+ - type: map_at_10
1786
+ value: 82.537
1787
+ - type: map_at_100
1788
+ value: 83.218
1789
+ - type: map_at_1000
1790
+ value: 83.238
1791
+ - type: map_at_3
1792
+ value: 79.586
1793
+ - type: map_at_5
1794
+ value: 81.416
1795
+ - type: mrr_at_1
1796
+ value: 79.17999999999999
1797
+ - type: mrr_at_10
1798
+ value: 85.79299999999999
1799
+ - type: mrr_at_100
1800
+ value: 85.937
1801
+ - type: mrr_at_1000
1802
+ value: 85.938
1803
+ - type: mrr_at_3
1804
+ value: 84.748
1805
+ - type: mrr_at_5
1806
+ value: 85.431
1807
+ - type: ndcg_at_1
1808
+ value: 79.17
1809
+ - type: ndcg_at_10
1810
+ value: 86.555
1811
+ - type: ndcg_at_100
1812
+ value: 88.005
1813
+ - type: ndcg_at_1000
1814
+ value: 88.146
1815
+ - type: ndcg_at_3
1816
+ value: 83.557
1817
+ - type: ndcg_at_5
1818
+ value: 85.152
1819
+ - type: precision_at_1
1820
+ value: 79.17
1821
+ - type: precision_at_10
1822
+ value: 13.163
1823
+ - type: precision_at_100
1824
+ value: 1.52
1825
+ - type: precision_at_1000
1826
+ value: 0.156
1827
+ - type: precision_at_3
1828
+ value: 36.53
1829
+ - type: precision_at_5
1830
+ value: 24.046
1831
+ - type: recall_at_1
1832
+ value: 68.728
1833
+ - type: recall_at_10
1834
+ value: 94.217
1835
+ - type: recall_at_100
1836
+ value: 99.295
1837
+ - type: recall_at_1000
1838
+ value: 99.964
1839
+ - type: recall_at_3
1840
+ value: 85.646
1841
+ - type: recall_at_5
1842
+ value: 90.113
1843
+ - task:
1844
+ type: Clustering
1845
+ dataset:
1846
+ type: mteb/reddit-clustering
1847
+ name: MTEB RedditClustering
1848
+ config: default
1849
+ split: test
1850
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
1851
+ metrics:
1852
+ - type: v_measure
1853
+ value: 56.15680266226348
1854
+ - task:
1855
+ type: Clustering
1856
+ dataset:
1857
+ type: mteb/reddit-clustering-p2p
1858
+ name: MTEB RedditClusteringP2P
1859
+ config: default
1860
+ split: test
1861
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
1862
+ metrics:
1863
+ - type: v_measure
1864
+ value: 63.4318549229047
1865
+ - task:
1866
+ type: Retrieval
1867
+ dataset:
1868
+ type: scidocs
1869
+ name: MTEB SCIDOCS
1870
+ config: default
1871
+ split: test
1872
+ revision: None
1873
+ metrics:
1874
+ - type: map_at_1
1875
+ value: 4.353
1876
+ - type: map_at_10
1877
+ value: 10.956000000000001
1878
+ - type: map_at_100
1879
+ value: 12.873999999999999
1880
+ - type: map_at_1000
1881
+ value: 13.177
1882
+ - type: map_at_3
1883
+ value: 7.854
1884
+ - type: map_at_5
1885
+ value: 9.327
1886
+ - type: mrr_at_1
1887
+ value: 21.4
1888
+ - type: mrr_at_10
1889
+ value: 31.948999999999998
1890
+ - type: mrr_at_100
1891
+ value: 33.039
1892
+ - type: mrr_at_1000
1893
+ value: 33.106
1894
+ - type: mrr_at_3
1895
+ value: 28.449999999999996
1896
+ - type: mrr_at_5
1897
+ value: 30.535
1898
+ - type: ndcg_at_1
1899
+ value: 21.4
1900
+ - type: ndcg_at_10
1901
+ value: 18.694
1902
+ - type: ndcg_at_100
1903
+ value: 26.275
1904
+ - type: ndcg_at_1000
1905
+ value: 31.836
1906
+ - type: ndcg_at_3
1907
+ value: 17.559
1908
+ - type: ndcg_at_5
1909
+ value: 15.372
1910
+ - type: precision_at_1
1911
+ value: 21.4
1912
+ - type: precision_at_10
1913
+ value: 9.790000000000001
1914
+ - type: precision_at_100
1915
+ value: 2.0709999999999997
1916
+ - type: precision_at_1000
1917
+ value: 0.34099999999999997
1918
+ - type: precision_at_3
1919
+ value: 16.467000000000002
1920
+ - type: precision_at_5
1921
+ value: 13.54
1922
+ - type: recall_at_1
1923
+ value: 4.353
1924
+ - type: recall_at_10
1925
+ value: 19.892000000000003
1926
+ - type: recall_at_100
1927
+ value: 42.067
1928
+ - type: recall_at_1000
1929
+ value: 69.268
1930
+ - type: recall_at_3
1931
+ value: 10.042
1932
+ - type: recall_at_5
1933
+ value: 13.741999999999999
1934
+ - task:
1935
+ type: STS
1936
+ dataset:
1937
+ type: mteb/sickr-sts
1938
+ name: MTEB SICK-R
1939
+ config: default
1940
+ split: test
1941
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
1942
+ metrics:
1943
+ - type: cos_sim_pearson
1944
+ value: 83.75433886279843
1945
+ - type: cos_sim_spearman
1946
+ value: 78.29727771767095
1947
+ - type: euclidean_pearson
1948
+ value: 80.83057828506621
1949
+ - type: euclidean_spearman
1950
+ value: 78.35203149750356
1951
+ - type: manhattan_pearson
1952
+ value: 80.7403553891142
1953
+ - type: manhattan_spearman
1954
+ value: 78.33670488531051
1955
+ - task:
1956
+ type: STS
1957
+ dataset:
1958
+ type: mteb/sts12-sts
1959
+ name: MTEB STS12
1960
+ config: default
1961
+ split: test
1962
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
1963
+ metrics:
1964
+ - type: cos_sim_pearson
1965
+ value: 84.59999465280839
1966
+ - type: cos_sim_spearman
1967
+ value: 75.79279003980383
1968
+ - type: euclidean_pearson
1969
+ value: 82.29895375956758
1970
+ - type: euclidean_spearman
1971
+ value: 77.33856514102094
1972
+ - type: manhattan_pearson
1973
+ value: 82.22694214534756
1974
+ - type: manhattan_spearman
1975
+ value: 77.3028993008695
1976
+ - task:
1977
+ type: STS
1978
+ dataset:
1979
+ type: mteb/sts13-sts
1980
+ name: MTEB STS13
1981
+ config: default
1982
+ split: test
1983
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
1984
+ metrics:
1985
+ - type: cos_sim_pearson
1986
+ value: 83.09296929691297
1987
+ - type: cos_sim_spearman
1988
+ value: 83.58056936846941
1989
+ - type: euclidean_pearson
1990
+ value: 83.84067483060005
1991
+ - type: euclidean_spearman
1992
+ value: 84.45155680480985
1993
+ - type: manhattan_pearson
1994
+ value: 83.82353052971942
1995
+ - type: manhattan_spearman
1996
+ value: 84.43030567861112
1997
+ - task:
1998
+ type: STS
1999
+ dataset:
2000
+ type: mteb/sts14-sts
2001
+ name: MTEB STS14
2002
+ config: default
2003
+ split: test
2004
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
2005
+ metrics:
2006
+ - type: cos_sim_pearson
2007
+ value: 82.74616852320915
2008
+ - type: cos_sim_spearman
2009
+ value: 79.948683747966
2010
+ - type: euclidean_pearson
2011
+ value: 81.55702283757084
2012
+ - type: euclidean_spearman
2013
+ value: 80.1721505114231
2014
+ - type: manhattan_pearson
2015
+ value: 81.52251518619441
2016
+ - type: manhattan_spearman
2017
+ value: 80.1469800135577
2018
+ - task:
2019
+ type: STS
2020
+ dataset:
2021
+ type: mteb/sts15-sts
2022
+ name: MTEB STS15
2023
+ config: default
2024
+ split: test
2025
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
2026
+ metrics:
2027
+ - type: cos_sim_pearson
2028
+ value: 87.97170104226318
2029
+ - type: cos_sim_spearman
2030
+ value: 88.82021731518206
2031
+ - type: euclidean_pearson
2032
+ value: 87.92950547187615
2033
+ - type: euclidean_spearman
2034
+ value: 88.67043634645866
2035
+ - type: manhattan_pearson
2036
+ value: 87.90668112827639
2037
+ - type: manhattan_spearman
2038
+ value: 88.64471082785317
2039
+ - task:
2040
+ type: STS
2041
+ dataset:
2042
+ type: mteb/sts16-sts
2043
+ name: MTEB STS16
2044
+ config: default
2045
+ split: test
2046
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
2047
+ metrics:
2048
+ - type: cos_sim_pearson
2049
+ value: 83.02790375770599
2050
+ - type: cos_sim_spearman
2051
+ value: 84.46308496590792
2052
+ - type: euclidean_pearson
2053
+ value: 84.29430000414911
2054
+ - type: euclidean_spearman
2055
+ value: 84.77298303589936
2056
+ - type: manhattan_pearson
2057
+ value: 84.23919291368665
2058
+ - type: manhattan_spearman
2059
+ value: 84.75272234871308
2060
+ - task:
2061
+ type: STS
2062
+ dataset:
2063
+ type: mteb/sts17-crosslingual-sts
2064
+ name: MTEB STS17 (en-en)
2065
+ config: en-en
2066
+ split: test
2067
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
2068
+ metrics:
2069
+ - type: cos_sim_pearson
2070
+ value: 87.62885108477064
2071
+ - type: cos_sim_spearman
2072
+ value: 87.58456196391622
2073
+ - type: euclidean_pearson
2074
+ value: 88.2602775281007
2075
+ - type: euclidean_spearman
2076
+ value: 87.51556278299846
2077
+ - type: manhattan_pearson
2078
+ value: 88.11224053672842
2079
+ - type: manhattan_spearman
2080
+ value: 87.4336094383095
2081
+ - task:
2082
+ type: STS
2083
+ dataset:
2084
+ type: mteb/sts22-crosslingual-sts
2085
+ name: MTEB STS22 (en)
2086
+ config: en
2087
+ split: test
2088
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
2089
+ metrics:
2090
+ - type: cos_sim_pearson
2091
+ value: 63.98187965128411
2092
+ - type: cos_sim_spearman
2093
+ value: 64.0653163219731
2094
+ - type: euclidean_pearson
2095
+ value: 62.30616725924099
2096
+ - type: euclidean_spearman
2097
+ value: 61.556971332295916
2098
+ - type: manhattan_pearson
2099
+ value: 62.07642330128549
2100
+ - type: manhattan_spearman
2101
+ value: 61.155494129828
2102
+ - task:
2103
+ type: STS
2104
+ dataset:
2105
+ type: mteb/stsbenchmark-sts
2106
+ name: MTEB STSBenchmark
2107
+ config: default
2108
+ split: test
2109
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
2110
+ metrics:
2111
+ - type: cos_sim_pearson
2112
+ value: 85.6089703921826
2113
+ - type: cos_sim_spearman
2114
+ value: 86.52303197250791
2115
+ - type: euclidean_pearson
2116
+ value: 85.95801955963246
2117
+ - type: euclidean_spearman
2118
+ value: 86.25242424112962
2119
+ - type: manhattan_pearson
2120
+ value: 85.88829100470312
2121
+ - type: manhattan_spearman
2122
+ value: 86.18742955805165
2123
+ - task:
2124
+ type: Reranking
2125
+ dataset:
2126
+ type: mteb/scidocs-reranking
2127
+ name: MTEB SciDocsRR
2128
+ config: default
2129
+ split: test
2130
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
2131
+ metrics:
2132
+ - type: map
2133
+ value: 83.02282098487036
2134
+ - type: mrr
2135
+ value: 95.05126409538174
2136
+ - task:
2137
+ type: Retrieval
2138
+ dataset:
2139
+ type: scifact
2140
+ name: MTEB SciFact
2141
+ config: default
2142
+ split: test
2143
+ revision: None
2144
+ metrics:
2145
+ - type: map_at_1
2146
+ value: 55.928
2147
+ - type: map_at_10
2148
+ value: 67.308
2149
+ - type: map_at_100
2150
+ value: 67.89500000000001
2151
+ - type: map_at_1000
2152
+ value: 67.91199999999999
2153
+ - type: map_at_3
2154
+ value: 65.091
2155
+ - type: map_at_5
2156
+ value: 66.412
2157
+ - type: mrr_at_1
2158
+ value: 58.667
2159
+ - type: mrr_at_10
2160
+ value: 68.401
2161
+ - type: mrr_at_100
2162
+ value: 68.804
2163
+ - type: mrr_at_1000
2164
+ value: 68.819
2165
+ - type: mrr_at_3
2166
+ value: 66.72200000000001
2167
+ - type: mrr_at_5
2168
+ value: 67.72200000000001
2169
+ - type: ndcg_at_1
2170
+ value: 58.667
2171
+ - type: ndcg_at_10
2172
+ value: 71.944
2173
+ - type: ndcg_at_100
2174
+ value: 74.464
2175
+ - type: ndcg_at_1000
2176
+ value: 74.82799999999999
2177
+ - type: ndcg_at_3
2178
+ value: 68.257
2179
+ - type: ndcg_at_5
2180
+ value: 70.10300000000001
2181
+ - type: precision_at_1
2182
+ value: 58.667
2183
+ - type: precision_at_10
2184
+ value: 9.533
2185
+ - type: precision_at_100
2186
+ value: 1.09
2187
+ - type: precision_at_1000
2188
+ value: 0.11199999999999999
2189
+ - type: precision_at_3
2190
+ value: 27.222
2191
+ - type: precision_at_5
2192
+ value: 17.533
2193
+ - type: recall_at_1
2194
+ value: 55.928
2195
+ - type: recall_at_10
2196
+ value: 84.65
2197
+ - type: recall_at_100
2198
+ value: 96.267
2199
+ - type: recall_at_1000
2200
+ value: 99
2201
+ - type: recall_at_3
2202
+ value: 74.656
2203
+ - type: recall_at_5
2204
+ value: 79.489
2205
+ - task:
2206
+ type: PairClassification
2207
+ dataset:
2208
+ type: mteb/sprintduplicatequestions-pairclassification
2209
+ name: MTEB SprintDuplicateQuestions
2210
+ config: default
2211
+ split: test
2212
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
2213
+ metrics:
2214
+ - type: cos_sim_accuracy
2215
+ value: 99.79009900990098
2216
+ - type: cos_sim_ap
2217
+ value: 94.5795129511524
2218
+ - type: cos_sim_f1
2219
+ value: 89.34673366834171
2220
+ - type: cos_sim_precision
2221
+ value: 89.79797979797979
2222
+ - type: cos_sim_recall
2223
+ value: 88.9
2224
+ - type: dot_accuracy
2225
+ value: 99.53465346534654
2226
+ - type: dot_ap
2227
+ value: 81.56492504352725
2228
+ - type: dot_f1
2229
+ value: 76.33816908454227
2230
+ - type: dot_precision
2231
+ value: 76.37637637637637
2232
+ - type: dot_recall
2233
+ value: 76.3
2234
+ - type: euclidean_accuracy
2235
+ value: 99.78514851485149
2236
+ - type: euclidean_ap
2237
+ value: 94.59134620408962
2238
+ - type: euclidean_f1
2239
+ value: 88.96484375
2240
+ - type: euclidean_precision
2241
+ value: 86.92748091603053
2242
+ - type: euclidean_recall
2243
+ value: 91.10000000000001
2244
+ - type: manhattan_accuracy
2245
+ value: 99.78415841584159
2246
+ - type: manhattan_ap
2247
+ value: 94.5190197328845
2248
+ - type: manhattan_f1
2249
+ value: 88.84462151394423
2250
+ - type: manhattan_precision
2251
+ value: 88.4920634920635
2252
+ - type: manhattan_recall
2253
+ value: 89.2
2254
+ - type: max_accuracy
2255
+ value: 99.79009900990098
2256
+ - type: max_ap
2257
+ value: 94.59134620408962
2258
+ - type: max_f1
2259
+ value: 89.34673366834171
2260
+ - task:
2261
+ type: Clustering
2262
+ dataset:
2263
+ type: mteb/stackexchange-clustering
2264
+ name: MTEB StackExchangeClustering
2265
+ config: default
2266
+ split: test
2267
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
2268
+ metrics:
2269
+ - type: v_measure
2270
+ value: 65.1487505617497
2271
+ - task:
2272
+ type: Clustering
2273
+ dataset:
2274
+ type: mteb/stackexchange-clustering-p2p
2275
+ name: MTEB StackExchangeClusteringP2P
2276
+ config: default
2277
+ split: test
2278
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
2279
+ metrics:
2280
+ - type: v_measure
2281
+ value: 32.502518166001856
2282
+ - task:
2283
+ type: Reranking
2284
+ dataset:
2285
+ type: mteb/stackoverflowdupquestions-reranking
2286
+ name: MTEB StackOverflowDupQuestions
2287
+ config: default
2288
+ split: test
2289
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
2290
+ metrics:
2291
+ - type: map
2292
+ value: 50.33775480236701
2293
+ - type: mrr
2294
+ value: 51.17302223919871
2295
+ - task:
2296
+ type: Summarization
2297
+ dataset:
2298
+ type: mteb/summeval
2299
+ name: MTEB SummEval
2300
+ config: default
2301
+ split: test
2302
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
2303
+ metrics:
2304
+ - type: cos_sim_pearson
2305
+ value: 30.561111309808208
2306
+ - type: cos_sim_spearman
2307
+ value: 30.2839254379273
2308
+ - type: dot_pearson
2309
+ value: 29.560242291401973
2310
+ - type: dot_spearman
2311
+ value: 30.51527274679116
2312
+ - task:
2313
+ type: Retrieval
2314
+ dataset:
2315
+ type: trec-covid
2316
+ name: MTEB TRECCOVID
2317
+ config: default
2318
+ split: test
2319
+ revision: None
2320
+ metrics:
2321
+ - type: map_at_1
2322
+ value: 0.215
2323
+ - type: map_at_10
2324
+ value: 1.752
2325
+ - type: map_at_100
2326
+ value: 9.258
2327
+ - type: map_at_1000
2328
+ value: 23.438
2329
+ - type: map_at_3
2330
+ value: 0.6
2331
+ - type: map_at_5
2332
+ value: 0.968
2333
+ - type: mrr_at_1
2334
+ value: 84
2335
+ - type: mrr_at_10
2336
+ value: 91.333
2337
+ - type: mrr_at_100
2338
+ value: 91.333
2339
+ - type: mrr_at_1000
2340
+ value: 91.333
2341
+ - type: mrr_at_3
2342
+ value: 91.333
2343
+ - type: mrr_at_5
2344
+ value: 91.333
2345
+ - type: ndcg_at_1
2346
+ value: 75
2347
+ - type: ndcg_at_10
2348
+ value: 69.596
2349
+ - type: ndcg_at_100
2350
+ value: 51.970000000000006
2351
+ - type: ndcg_at_1000
2352
+ value: 48.864999999999995
2353
+ - type: ndcg_at_3
2354
+ value: 73.92699999999999
2355
+ - type: ndcg_at_5
2356
+ value: 73.175
2357
+ - type: precision_at_1
2358
+ value: 84
2359
+ - type: precision_at_10
2360
+ value: 74
2361
+ - type: precision_at_100
2362
+ value: 53.2
2363
+ - type: precision_at_1000
2364
+ value: 21.836
2365
+ - type: precision_at_3
2366
+ value: 79.333
2367
+ - type: precision_at_5
2368
+ value: 78.4
2369
+ - type: recall_at_1
2370
+ value: 0.215
2371
+ - type: recall_at_10
2372
+ value: 1.9609999999999999
2373
+ - type: recall_at_100
2374
+ value: 12.809999999999999
2375
+ - type: recall_at_1000
2376
+ value: 46.418
2377
+ - type: recall_at_3
2378
+ value: 0.6479999999999999
2379
+ - type: recall_at_5
2380
+ value: 1.057
2381
+ - task:
2382
+ type: Retrieval
2383
+ dataset:
2384
+ type: webis-touche2020
2385
+ name: MTEB Touche2020
2386
+ config: default
2387
+ split: test
2388
+ revision: None
2389
+ metrics:
2390
+ - type: map_at_1
2391
+ value: 3.066
2392
+ - type: map_at_10
2393
+ value: 10.508000000000001
2394
+ - type: map_at_100
2395
+ value: 16.258
2396
+ - type: map_at_1000
2397
+ value: 17.705000000000002
2398
+ - type: map_at_3
2399
+ value: 6.157
2400
+ - type: map_at_5
2401
+ value: 7.510999999999999
2402
+ - type: mrr_at_1
2403
+ value: 34.694
2404
+ - type: mrr_at_10
2405
+ value: 48.786
2406
+ - type: mrr_at_100
2407
+ value: 49.619
2408
+ - type: mrr_at_1000
2409
+ value: 49.619
2410
+ - type: mrr_at_3
2411
+ value: 45.918
2412
+ - type: mrr_at_5
2413
+ value: 46.837
2414
+ - type: ndcg_at_1
2415
+ value: 31.633
2416
+ - type: ndcg_at_10
2417
+ value: 26.401999999999997
2418
+ - type: ndcg_at_100
2419
+ value: 37.139
2420
+ - type: ndcg_at_1000
2421
+ value: 48.012
2422
+ - type: ndcg_at_3
2423
+ value: 31.875999999999998
2424
+ - type: ndcg_at_5
2425
+ value: 27.383000000000003
2426
+ - type: precision_at_1
2427
+ value: 34.694
2428
+ - type: precision_at_10
2429
+ value: 22.857
2430
+ - type: precision_at_100
2431
+ value: 7.611999999999999
2432
+ - type: precision_at_1000
2433
+ value: 1.492
2434
+ - type: precision_at_3
2435
+ value: 33.333
2436
+ - type: precision_at_5
2437
+ value: 26.122
2438
+ - type: recall_at_1
2439
+ value: 3.066
2440
+ - type: recall_at_10
2441
+ value: 16.239
2442
+ - type: recall_at_100
2443
+ value: 47.29
2444
+ - type: recall_at_1000
2445
+ value: 81.137
2446
+ - type: recall_at_3
2447
+ value: 7.069
2448
+ - type: recall_at_5
2449
+ value: 9.483
2450
+ - task:
2451
+ type: Classification
2452
+ dataset:
2453
+ type: mteb/toxic_conversations_50k
2454
+ name: MTEB ToxicConversationsClassification
2455
+ config: default
2456
+ split: test
2457
+ revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
2458
+ metrics:
2459
+ - type: accuracy
2460
+ value: 72.1126
2461
+ - type: ap
2462
+ value: 14.710862719285753
2463
+ - type: f1
2464
+ value: 55.437808972378846
2465
+ - task:
2466
+ type: Classification
2467
+ dataset:
2468
+ type: mteb/tweet_sentiment_extraction
2469
+ name: MTEB TweetSentimentExtractionClassification
2470
+ config: default
2471
+ split: test
2472
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
2473
+ metrics:
2474
+ - type: accuracy
2475
+ value: 60.39049235993209
2476
+ - type: f1
2477
+ value: 60.69810537250234
2478
+ - task:
2479
+ type: Clustering
2480
+ dataset:
2481
+ type: mteb/twentynewsgroups-clustering
2482
+ name: MTEB TwentyNewsgroupsClustering
2483
+ config: default
2484
+ split: test
2485
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
2486
+ metrics:
2487
+ - type: v_measure
2488
+ value: 48.15576640316866
2489
+ - task:
2490
+ type: PairClassification
2491
+ dataset:
2492
+ type: mteb/twittersemeval2015-pairclassification
2493
+ name: MTEB TwitterSemEval2015
2494
+ config: default
2495
+ split: test
2496
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
2497
+ metrics:
2498
+ - type: cos_sim_accuracy
2499
+ value: 86.52917684925792
2500
+ - type: cos_sim_ap
2501
+ value: 75.97497873817315
2502
+ - type: cos_sim_f1
2503
+ value: 70.01151926276718
2504
+ - type: cos_sim_precision
2505
+ value: 67.98409147402435
2506
+ - type: cos_sim_recall
2507
+ value: 72.16358839050132
2508
+ - type: dot_accuracy
2509
+ value: 82.47004828038385
2510
+ - type: dot_ap
2511
+ value: 62.48739894974198
2512
+ - type: dot_f1
2513
+ value: 59.13107511045656
2514
+ - type: dot_precision
2515
+ value: 55.27765029830197
2516
+ - type: dot_recall
2517
+ value: 63.562005277044854
2518
+ - type: euclidean_accuracy
2519
+ value: 86.46361089586935
2520
+ - type: euclidean_ap
2521
+ value: 75.59282886839452
2522
+ - type: euclidean_f1
2523
+ value: 69.6465443945099
2524
+ - type: euclidean_precision
2525
+ value: 64.52847175331982
2526
+ - type: euclidean_recall
2527
+ value: 75.64643799472296
2528
+ - type: manhattan_accuracy
2529
+ value: 86.43380818978363
2530
+ - type: manhattan_ap
2531
+ value: 75.5742420974403
2532
+ - type: manhattan_f1
2533
+ value: 69.8636926889715
2534
+ - type: manhattan_precision
2535
+ value: 65.8644859813084
2536
+ - type: manhattan_recall
2537
+ value: 74.37994722955145
2538
+ - type: max_accuracy
2539
+ value: 86.52917684925792
2540
+ - type: max_ap
2541
+ value: 75.97497873817315
2542
+ - type: max_f1
2543
+ value: 70.01151926276718
2544
+ - task:
2545
+ type: PairClassification
2546
+ dataset:
2547
+ type: mteb/twitterurlcorpus-pairclassification
2548
+ name: MTEB TwitterURLCorpus
2549
+ config: default
2550
+ split: test
2551
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
2552
+ metrics:
2553
+ - type: cos_sim_accuracy
2554
+ value: 89.29056545193464
2555
+ - type: cos_sim_ap
2556
+ value: 86.63028865482376
2557
+ - type: cos_sim_f1
2558
+ value: 79.18166458532285
2559
+ - type: cos_sim_precision
2560
+ value: 75.70585756426465
2561
+ - type: cos_sim_recall
2562
+ value: 82.99199260856174
2563
+ - type: dot_accuracy
2564
+ value: 85.23305002522606
2565
+ - type: dot_ap
2566
+ value: 76.0482687263196
2567
+ - type: dot_f1
2568
+ value: 70.80484330484332
2569
+ - type: dot_precision
2570
+ value: 65.86933474688577
2571
+ - type: dot_recall
2572
+ value: 76.53988296889437
2573
+ - type: euclidean_accuracy
2574
+ value: 89.26145845461248
2575
+ - type: euclidean_ap
2576
+ value: 86.54073288416006
2577
+ - type: euclidean_f1
2578
+ value: 78.9721371479794
2579
+ - type: euclidean_precision
2580
+ value: 76.68649354417525
2581
+ - type: euclidean_recall
2582
+ value: 81.39821373575609
2583
+ - type: manhattan_accuracy
2584
+ value: 89.22847052431405
2585
+ - type: manhattan_ap
2586
+ value: 86.51250729037905
2587
+ - type: manhattan_f1
2588
+ value: 78.94601825044894
2589
+ - type: manhattan_precision
2590
+ value: 75.32694594027555
2591
+ - type: manhattan_recall
2592
+ value: 82.93039728980598
2593
+ - type: max_accuracy
2594
+ value: 89.29056545193464
2595
+ - type: max_ap
2596
+ value: 86.63028865482376
2597
+ - type: max_f1
2598
+ value: 79.18166458532285
2599
+ language:
2600
+ - en
2601
+ license: mit
2602
+ ---
2603
+
2604
+ # E5-base-v2
2605
+
2606
+ [Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
2607
+ Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022
2608
+
2609
+ This model has 12 layers and an embedding size of 768.
2610
+
2611
+ ## Usage
2612
+
2613
+ Below is an example of how to encode queries and passages from the MS-MARCO passage ranking dataset.
2614
+
2615
+ ```python
2616
+ import torch.nn.functional as F
2617
+
2618
+ from torch import Tensor
2619
+ from transformers import AutoTokenizer, AutoModel
2620
+
2621
+
2622
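+ # Mask out padding positions, then average the remaining token embeddings (mean pooling).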
+ def average_pool(last_hidden_states: Tensor,
2623
+                  attention_mask: Tensor) -> Tensor:
2624
+     last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
2625
+     return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]
2626
+
2627
+
2628
+ # Each input text should start with "query: " or "passage: ".
2629
+ # For tasks other than retrieval, you can simply use the "query: " prefix.
2630
+ input_texts = ['query: how much protein should a female eat',
2631
+ 'query: summit define',
2632
+ "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
2633
+ "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]
2634
+
2635
+ tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-base-v2')
2636
+ model = AutoModel.from_pretrained('intfloat/e5-base-v2')
2637
+
2638
+ # Tokenize the input texts
2639
+ batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')
2640
+
2641
+ outputs = model(**batch_dict)
2642
+ embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])
2643
+
2644
+ # (Optionally) normalize embeddings
2645
+ embeddings = F.normalize(embeddings, p=2, dim=1)
2646
+ scores = (embeddings[:2] @ embeddings[2:].T) * 100
2647
+ print(scores.tolist())
2648
+ ```
2649
+
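+ Since this repository also ships a sentence-transformers configuration (`modules.json` chains a Transformer module into mean pooling and L2 normalization), the same encoder can be used through the `sentence_transformers` API. Below is a minimal sketch, assuming the package is installed and using the `intfloat/e5-base-v2` checkpoint named above; the passage string is only an illustrative placeholder.
+
+ ```python
+ from sentence_transformers import SentenceTransformer
+
+ # Loads the Transformer -> mean Pooling -> Normalize pipeline described in modules.json.
+ model = SentenceTransformer('intfloat/e5-base-v2')
+
+ # The "query: " / "passage: " prefixes are still required for this model.
+ embeddings = model.encode([
+     'query: how much protein should a female eat',
+     'passage: protein requirements vary with age, weight and activity level',
+ ])
+ print(embeddings.shape)  # (2, 768)
+ ```
+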
2650
+ ## Training Details
2651
+
2652
+ Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).
2653
+
2654
+ ## Benchmark Evaluation
2655
+
2656
+ Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
2657
+ on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB](https://arxiv.org/abs/2210.07316) benchmarks.
2658
+
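+ For a quick local sanity check, a single MTEB task can also be run with the open-source `mteb` package. This is only a rough sketch (the API shown is assumed from the MTEB project); unlike the official scripts, the plain sentence-transformers wrapper does not add the "query: "/"passage: " prefixes, so the numbers may differ slightly from those reported above.
+
+ ```python
+ from mteb import MTEB
+ from sentence_transformers import SentenceTransformer
+
+ model = SentenceTransformer('intfloat/e5-base-v2')
+
+ # Run one STS task instead of the full benchmark; results are written as JSON files.
+ evaluation = MTEB(tasks=["STSBenchmark"])
+ evaluation.run(model, output_folder="results/e5-base-v2")
+ ```
+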
2659
+ ## Citation
2660
+
2661
+ If you find our paper or models helpful, please consider citing us as follows:
2662
+
2663
+ ```bibtex
2664
+ @article{wang2022text,
2665
+ title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
2666
+ author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
2667
+ journal={arXiv preprint arXiv:2212.03533},
2668
+ year={2022}
2669
+ }
2670
+ ```
2671
+
2672
+ ## Limitations
2673
+
2674
+ This model only works for English texts. Long texts will be truncated to at most 512 tokens.
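+
+ A minimal sketch of the truncation behaviour (the repeated filler word is just a stand-in for any over-length input):
+
+ ```python
+ from transformers import AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-base-v2')
+
+ # Inputs longer than 512 tokens are silently cut off at the model's maximum sequence length.
+ long_text = 'passage: ' + 'protein ' * 2000
+ batch = tokenizer(long_text, max_length=512, truncation=True, return_tensors='pt')
+ print(batch['input_ids'].shape)  # torch.Size([1, 512])
+ ```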
config.json ADDED
@@ -0,0 +1,26 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "_name_or_path": "tmp/",
3
+ "architectures": [
4
+ "BertModel"
5
+ ],
6
+ "attention_probs_dropout_prob": 0.1,
7
+ "classifier_dropout": null,
8
+ "gradient_checkpointing": false,
9
+ "hidden_act": "gelu",
10
+ "hidden_dropout_prob": 0.1,
11
+ "hidden_size": 768,
12
+ "initializer_range": 0.02,
13
+ "intermediate_size": 3072,
14
+ "layer_norm_eps": 1e-12,
15
+ "max_position_embeddings": 512,
16
+ "model_type": "bert",
17
+ "num_attention_heads": 12,
18
+ "num_hidden_layers": 12,
19
+ "pad_token_id": 0,
20
+ "position_embedding_type": "absolute",
21
+ "torch_dtype": "float32",
22
+ "transformers_version": "4.29.0.dev0",
23
+ "type_vocab_size": 2,
24
+ "use_cache": true,
25
+ "vocab_size": 30522
26
+ }
modules.json ADDED
@@ -0,0 +1,20 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ [
2
+ {
3
+ "idx": 0,
4
+ "name": "0",
5
+ "path": "",
6
+ "type": "sentence_transformers.models.Transformer"
7
+ },
8
+ {
9
+ "idx": 1,
10
+ "name": "1",
11
+ "path": "1_Pooling",
12
+ "type": "sentence_transformers.models.Pooling"
13
+ },
14
+ {
15
+ "idx": 2,
16
+ "name": "2",
17
+ "path": "2_Normalize",
18
+ "type": "sentence_transformers.models.Normalize"
19
+ }
20
+ ]
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:5b119cd34c663c26fcd8bbe82e9873e9a16c6588b9817a4243947a6de478c273
3
+ size 437997357
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
 
 
 
 
 
1
+ {
2
+ "max_seq_length": 512,
3
+ "do_lower_case": false
4
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,7 @@
 
 
 
 
 
 
 
 
1
+ {
2
+ "cls_token": "[CLS]",
3
+ "mask_token": "[MASK]",
4
+ "pad_token": "[PAD]",
5
+ "sep_token": "[SEP]",
6
+ "unk_token": "[UNK]"
7
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,13 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ {
2
+ "clean_up_tokenization_spaces": true,
3
+ "cls_token": "[CLS]",
4
+ "do_lower_case": true,
5
+ "mask_token": "[MASK]",
6
+ "model_max_length": 512,
7
+ "pad_token": "[PAD]",
8
+ "sep_token": "[SEP]",
9
+ "strip_accents": null,
10
+ "tokenize_chinese_chars": true,
11
+ "tokenizer_class": "BertTokenizer",
12
+ "unk_token": "[UNK]"
13
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff