michaelfeil committed
Commit: 14006c8
Parent: a58dace

Upload intfloat/e5-small ctranslate fp16 weights

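The commit message above refers to weights stored in float16. As a minimal standard-library sketch (not the CTranslate2 conversion tool itself), the following shows what an fp32 → fp16 round trip does to a value, which is the precision trade-off implied by shipping fp16 weights; the sample values are illustrative only:

```python
import struct

def roundtrip_fp16(x: float) -> float:
    # Pack as IEEE 754 half precision ('e' format) and unpack back to a
    # Python float, i.e. simulate storing the value as an fp16 weight.
    return struct.unpack('<e', struct.pack('<e', x))[0]

# 65504.0 is the largest finite fp16 value; 1.0 is exactly representable.
for w in (0.1234567, 1.0, 3.14159265, 65504.0):
    r = roundtrip_fp16(w)
    print(f"{w!r} -> {r!r} (abs err {abs(w - r):.3e})")
```

Values already representable in half precision survive exactly; everything else is rounded to roughly three significant decimal digits, which embedding models usually tolerate well.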
.gitattributes CHANGED
@@ -25,7 +25,6 @@
  *.safetensors filter=lfs diff=lfs merge=lfs -text
  saved_model/**/* filter=lfs diff=lfs merge=lfs -text
  *.tar.* filter=lfs diff=lfs merge=lfs -text
- *.tar filter=lfs diff=lfs merge=lfs -text
  *.tflite filter=lfs diff=lfs merge=lfs -text
  *.tgz filter=lfs diff=lfs merge=lfs -text
  *.wasm filter=lfs diff=lfs merge=lfs -text
README.md ADDED
@@ -0,0 +1,2719 @@
1
+ ---
2
+ tags:
3
+ - ctranslate2
4
+ - int8
5
+ - float16
6
+ - mteb
7
+ model-index:
8
+ - name: e5-small
9
+ results:
10
+ - task:
11
+ type: Classification
12
+ dataset:
13
+ type: mteb/amazon_counterfactual
14
+ name: MTEB AmazonCounterfactualClassification (en)
15
+ config: en
16
+ split: test
17
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
18
+ metrics:
19
+ - type: accuracy
20
+ value: 76.22388059701493
21
+ - type: ap
22
+ value: 40.27466219523129
23
+ - type: f1
24
+ value: 70.60533006025108
25
+ - task:
26
+ type: Classification
27
+ dataset:
28
+ type: mteb/amazon_polarity
29
+ name: MTEB AmazonPolarityClassification
30
+ config: default
31
+ split: test
32
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
33
+ metrics:
34
+ - type: accuracy
35
+ value: 87.525775
36
+ - type: ap
37
+ value: 83.51063993897611
38
+ - type: f1
39
+ value: 87.49342736805572
40
+ - task:
41
+ type: Classification
42
+ dataset:
43
+ type: mteb/amazon_reviews_multi
44
+ name: MTEB AmazonReviewsClassification (en)
45
+ config: en
46
+ split: test
47
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
48
+ metrics:
49
+ - type: accuracy
50
+ value: 42.611999999999995
51
+ - type: f1
52
+ value: 42.05088045932892
53
+ - task:
54
+ type: Retrieval
55
+ dataset:
56
+ type: arguana
57
+ name: MTEB ArguAna
58
+ config: default
59
+ split: test
60
+ revision: None
61
+ metrics:
62
+ - type: map_at_1
63
+ value: 23.826
64
+ - type: map_at_10
65
+ value: 38.269
66
+ - type: map_at_100
67
+ value: 39.322
68
+ - type: map_at_1000
69
+ value: 39.344
70
+ - type: map_at_3
71
+ value: 33.428000000000004
72
+ - type: map_at_5
73
+ value: 36.063
74
+ - type: mrr_at_1
75
+ value: 24.253
76
+ - type: mrr_at_10
77
+ value: 38.425
78
+ - type: mrr_at_100
79
+ value: 39.478
80
+ - type: mrr_at_1000
81
+ value: 39.5
82
+ - type: mrr_at_3
83
+ value: 33.606
84
+ - type: mrr_at_5
85
+ value: 36.195
86
+ - type: ndcg_at_1
87
+ value: 23.826
88
+ - type: ndcg_at_10
89
+ value: 46.693
90
+ - type: ndcg_at_100
91
+ value: 51.469
92
+ - type: ndcg_at_1000
93
+ value: 52.002
94
+ - type: ndcg_at_3
95
+ value: 36.603
96
+ - type: ndcg_at_5
97
+ value: 41.365
98
+ - type: precision_at_1
99
+ value: 23.826
100
+ - type: precision_at_10
101
+ value: 7.383000000000001
102
+ - type: precision_at_100
103
+ value: 0.9530000000000001
104
+ - type: precision_at_1000
105
+ value: 0.099
106
+ - type: precision_at_3
107
+ value: 15.268
108
+ - type: precision_at_5
109
+ value: 11.479000000000001
110
+ - type: recall_at_1
111
+ value: 23.826
112
+ - type: recall_at_10
113
+ value: 73.82600000000001
114
+ - type: recall_at_100
115
+ value: 95.306
116
+ - type: recall_at_1000
117
+ value: 99.431
118
+ - type: recall_at_3
119
+ value: 45.804
120
+ - type: recall_at_5
121
+ value: 57.397
122
+ - task:
123
+ type: Clustering
124
+ dataset:
125
+ type: mteb/arxiv-clustering-p2p
126
+ name: MTEB ArxivClusteringP2P
127
+ config: default
128
+ split: test
129
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
130
+ metrics:
131
+ - type: v_measure
132
+ value: 44.13995374767436
133
+ - task:
134
+ type: Clustering
135
+ dataset:
136
+ type: mteb/arxiv-clustering-s2s
137
+ name: MTEB ArxivClusteringS2S
138
+ config: default
139
+ split: test
140
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
141
+ metrics:
142
+ - type: v_measure
143
+ value: 37.13950072624313
144
+ - task:
145
+ type: Reranking
146
+ dataset:
147
+ type: mteb/askubuntudupquestions-reranking
148
+ name: MTEB AskUbuntuDupQuestions
149
+ config: default
150
+ split: test
151
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
152
+ metrics:
153
+ - type: map
154
+ value: 59.35843292105327
155
+ - type: mrr
156
+ value: 73.72312359846987
157
+ - task:
158
+ type: STS
159
+ dataset:
160
+ type: mteb/biosses-sts
161
+ name: MTEB BIOSSES
162
+ config: default
163
+ split: test
164
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
165
+ metrics:
166
+ - type: cos_sim_pearson
167
+ value: 84.55140418324174
168
+ - type: cos_sim_spearman
169
+ value: 84.21637675860022
170
+ - type: euclidean_pearson
171
+ value: 81.26069614610006
172
+ - type: euclidean_spearman
173
+ value: 83.25069210421785
174
+ - type: manhattan_pearson
175
+ value: 80.17441422581014
176
+ - type: manhattan_spearman
177
+ value: 81.87596198487877
178
+ - task:
179
+ type: Classification
180
+ dataset:
181
+ type: mteb/banking77
182
+ name: MTEB Banking77Classification
183
+ config: default
184
+ split: test
185
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
186
+ metrics:
187
+ - type: accuracy
188
+ value: 81.87337662337661
189
+ - type: f1
190
+ value: 81.76647866926402
191
+ - task:
192
+ type: Clustering
193
+ dataset:
194
+ type: mteb/biorxiv-clustering-p2p
195
+ name: MTEB BiorxivClusteringP2P
196
+ config: default
197
+ split: test
198
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
199
+ metrics:
200
+ - type: v_measure
201
+ value: 35.80600542614507
202
+ - task:
203
+ type: Clustering
204
+ dataset:
205
+ type: mteb/biorxiv-clustering-s2s
206
+ name: MTEB BiorxivClusteringS2S
207
+ config: default
208
+ split: test
209
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
210
+ metrics:
211
+ - type: v_measure
212
+ value: 31.86321613256603
213
+ - task:
214
+ type: Retrieval
215
+ dataset:
216
+ type: BeIR/cqadupstack
217
+ name: MTEB CQADupstackAndroidRetrieval
218
+ config: default
219
+ split: test
220
+ revision: None
221
+ metrics:
222
+ - type: map_at_1
223
+ value: 32.054
224
+ - type: map_at_10
225
+ value: 40.699999999999996
226
+ - type: map_at_100
227
+ value: 41.818
228
+ - type: map_at_1000
229
+ value: 41.959999999999994
230
+ - type: map_at_3
231
+ value: 37.742
232
+ - type: map_at_5
233
+ value: 39.427
234
+ - type: mrr_at_1
235
+ value: 38.769999999999996
236
+ - type: mrr_at_10
237
+ value: 46.150000000000006
238
+ - type: mrr_at_100
239
+ value: 46.865
240
+ - type: mrr_at_1000
241
+ value: 46.925
242
+ - type: mrr_at_3
243
+ value: 43.705
244
+ - type: mrr_at_5
245
+ value: 45.214999999999996
246
+ - type: ndcg_at_1
247
+ value: 38.769999999999996
248
+ - type: ndcg_at_10
249
+ value: 45.778
250
+ - type: ndcg_at_100
251
+ value: 50.38
252
+ - type: ndcg_at_1000
253
+ value: 52.922999999999995
254
+ - type: ndcg_at_3
255
+ value: 41.597
256
+ - type: ndcg_at_5
257
+ value: 43.631
258
+ - type: precision_at_1
259
+ value: 38.769999999999996
260
+ - type: precision_at_10
261
+ value: 8.269
262
+ - type: precision_at_100
263
+ value: 1.278
264
+ - type: precision_at_1000
265
+ value: 0.178
266
+ - type: precision_at_3
267
+ value: 19.266
268
+ - type: precision_at_5
269
+ value: 13.705
270
+ - type: recall_at_1
271
+ value: 32.054
272
+ - type: recall_at_10
273
+ value: 54.947
274
+ - type: recall_at_100
275
+ value: 74.79599999999999
276
+ - type: recall_at_1000
277
+ value: 91.40899999999999
278
+ - type: recall_at_3
279
+ value: 42.431000000000004
280
+ - type: recall_at_5
281
+ value: 48.519
282
+ - task:
283
+ type: Retrieval
284
+ dataset:
285
+ type: BeIR/cqadupstack
286
+ name: MTEB CQADupstackEnglishRetrieval
287
+ config: default
288
+ split: test
289
+ revision: None
290
+ metrics:
291
+ - type: map_at_1
292
+ value: 29.035
293
+ - type: map_at_10
294
+ value: 38.007000000000005
295
+ - type: map_at_100
296
+ value: 39.125
297
+ - type: map_at_1000
298
+ value: 39.251999999999995
299
+ - type: map_at_3
300
+ value: 35.77
301
+ - type: map_at_5
302
+ value: 37.057
303
+ - type: mrr_at_1
304
+ value: 36.497
305
+ - type: mrr_at_10
306
+ value: 44.077
307
+ - type: mrr_at_100
308
+ value: 44.743
309
+ - type: mrr_at_1000
310
+ value: 44.79
311
+ - type: mrr_at_3
312
+ value: 42.123
313
+ - type: mrr_at_5
314
+ value: 43.308
315
+ - type: ndcg_at_1
316
+ value: 36.497
317
+ - type: ndcg_at_10
318
+ value: 42.986000000000004
319
+ - type: ndcg_at_100
320
+ value: 47.323
321
+ - type: ndcg_at_1000
322
+ value: 49.624
323
+ - type: ndcg_at_3
324
+ value: 39.805
325
+ - type: ndcg_at_5
326
+ value: 41.286
327
+ - type: precision_at_1
328
+ value: 36.497
329
+ - type: precision_at_10
330
+ value: 7.8340000000000005
331
+ - type: precision_at_100
332
+ value: 1.269
333
+ - type: precision_at_1000
334
+ value: 0.178
335
+ - type: precision_at_3
336
+ value: 19.023
337
+ - type: precision_at_5
338
+ value: 13.248
339
+ - type: recall_at_1
340
+ value: 29.035
341
+ - type: recall_at_10
342
+ value: 51.06
343
+ - type: recall_at_100
344
+ value: 69.64099999999999
345
+ - type: recall_at_1000
346
+ value: 84.49
347
+ - type: recall_at_3
348
+ value: 41.333999999999996
349
+ - type: recall_at_5
350
+ value: 45.663
351
+ - task:
352
+ type: Retrieval
353
+ dataset:
354
+ type: BeIR/cqadupstack
355
+ name: MTEB CQADupstackGamingRetrieval
356
+ config: default
357
+ split: test
358
+ revision: None
359
+ metrics:
360
+ - type: map_at_1
361
+ value: 37.239
362
+ - type: map_at_10
363
+ value: 47.873
364
+ - type: map_at_100
365
+ value: 48.842999999999996
366
+ - type: map_at_1000
367
+ value: 48.913000000000004
368
+ - type: map_at_3
369
+ value: 45.050000000000004
370
+ - type: map_at_5
371
+ value: 46.498
372
+ - type: mrr_at_1
373
+ value: 42.508
374
+ - type: mrr_at_10
375
+ value: 51.44
376
+ - type: mrr_at_100
377
+ value: 52.087
378
+ - type: mrr_at_1000
379
+ value: 52.129999999999995
380
+ - type: mrr_at_3
381
+ value: 49.164
382
+ - type: mrr_at_5
383
+ value: 50.343
384
+ - type: ndcg_at_1
385
+ value: 42.508
386
+ - type: ndcg_at_10
387
+ value: 53.31399999999999
388
+ - type: ndcg_at_100
389
+ value: 57.245000000000005
390
+ - type: ndcg_at_1000
391
+ value: 58.794000000000004
392
+ - type: ndcg_at_3
393
+ value: 48.295
394
+ - type: ndcg_at_5
395
+ value: 50.415
396
+ - type: precision_at_1
397
+ value: 42.508
398
+ - type: precision_at_10
399
+ value: 8.458
400
+ - type: precision_at_100
401
+ value: 1.133
402
+ - type: precision_at_1000
403
+ value: 0.132
404
+ - type: precision_at_3
405
+ value: 21.191
406
+ - type: precision_at_5
407
+ value: 14.307
408
+ - type: recall_at_1
409
+ value: 37.239
410
+ - type: recall_at_10
411
+ value: 65.99000000000001
412
+ - type: recall_at_100
413
+ value: 82.99499999999999
414
+ - type: recall_at_1000
415
+ value: 94.128
416
+ - type: recall_at_3
417
+ value: 52.382
418
+ - type: recall_at_5
419
+ value: 57.648999999999994
420
+ - task:
421
+ type: Retrieval
422
+ dataset:
423
+ type: BeIR/cqadupstack
424
+ name: MTEB CQADupstackGisRetrieval
425
+ config: default
426
+ split: test
427
+ revision: None
428
+ metrics:
429
+ - type: map_at_1
430
+ value: 23.039
431
+ - type: map_at_10
432
+ value: 29.694
433
+ - type: map_at_100
434
+ value: 30.587999999999997
435
+ - type: map_at_1000
436
+ value: 30.692999999999998
437
+ - type: map_at_3
438
+ value: 27.708
439
+ - type: map_at_5
440
+ value: 28.774
441
+ - type: mrr_at_1
442
+ value: 24.633
443
+ - type: mrr_at_10
444
+ value: 31.478
445
+ - type: mrr_at_100
446
+ value: 32.299
447
+ - type: mrr_at_1000
448
+ value: 32.381
449
+ - type: mrr_at_3
450
+ value: 29.435
451
+ - type: mrr_at_5
452
+ value: 30.446
453
+ - type: ndcg_at_1
454
+ value: 24.633
455
+ - type: ndcg_at_10
456
+ value: 33.697
457
+ - type: ndcg_at_100
458
+ value: 38.080000000000005
459
+ - type: ndcg_at_1000
460
+ value: 40.812
461
+ - type: ndcg_at_3
462
+ value: 29.654000000000003
463
+ - type: ndcg_at_5
464
+ value: 31.474000000000004
465
+ - type: precision_at_1
466
+ value: 24.633
467
+ - type: precision_at_10
468
+ value: 5.0729999999999995
469
+ - type: precision_at_100
470
+ value: 0.753
471
+ - type: precision_at_1000
472
+ value: 0.10300000000000001
473
+ - type: precision_at_3
474
+ value: 12.279
475
+ - type: precision_at_5
476
+ value: 8.452
477
+ - type: recall_at_1
478
+ value: 23.039
479
+ - type: recall_at_10
480
+ value: 44.275999999999996
481
+ - type: recall_at_100
482
+ value: 64.4
483
+ - type: recall_at_1000
484
+ value: 85.135
485
+ - type: recall_at_3
486
+ value: 33.394
487
+ - type: recall_at_5
488
+ value: 37.687
489
+ - task:
490
+ type: Retrieval
491
+ dataset:
492
+ type: BeIR/cqadupstack
493
+ name: MTEB CQADupstackMathematicaRetrieval
494
+ config: default
495
+ split: test
496
+ revision: None
497
+ metrics:
498
+ - type: map_at_1
499
+ value: 13.594999999999999
500
+ - type: map_at_10
501
+ value: 19.933999999999997
502
+ - type: map_at_100
503
+ value: 20.966
504
+ - type: map_at_1000
505
+ value: 21.087
506
+ - type: map_at_3
507
+ value: 17.749000000000002
508
+ - type: map_at_5
509
+ value: 19.156000000000002
510
+ - type: mrr_at_1
511
+ value: 17.662
512
+ - type: mrr_at_10
513
+ value: 24.407
514
+ - type: mrr_at_100
515
+ value: 25.385
516
+ - type: mrr_at_1000
517
+ value: 25.465
518
+ - type: mrr_at_3
519
+ value: 22.056
520
+ - type: mrr_at_5
521
+ value: 23.630000000000003
522
+ - type: ndcg_at_1
523
+ value: 17.662
524
+ - type: ndcg_at_10
525
+ value: 24.391
526
+ - type: ndcg_at_100
527
+ value: 29.681
528
+ - type: ndcg_at_1000
529
+ value: 32.923
530
+ - type: ndcg_at_3
531
+ value: 20.271
532
+ - type: ndcg_at_5
533
+ value: 22.621
534
+ - type: precision_at_1
535
+ value: 17.662
536
+ - type: precision_at_10
537
+ value: 4.44
538
+ - type: precision_at_100
539
+ value: 0.8200000000000001
540
+ - type: precision_at_1000
541
+ value: 0.125
542
+ - type: precision_at_3
543
+ value: 9.577
544
+ - type: precision_at_5
545
+ value: 7.313
546
+ - type: recall_at_1
547
+ value: 13.594999999999999
548
+ - type: recall_at_10
549
+ value: 33.976
550
+ - type: recall_at_100
551
+ value: 57.43000000000001
552
+ - type: recall_at_1000
553
+ value: 80.958
554
+ - type: recall_at_3
555
+ value: 22.897000000000002
556
+ - type: recall_at_5
557
+ value: 28.714000000000002
558
+ - task:
559
+ type: Retrieval
560
+ dataset:
561
+ type: BeIR/cqadupstack
562
+ name: MTEB CQADupstackPhysicsRetrieval
563
+ config: default
564
+ split: test
565
+ revision: None
566
+ metrics:
567
+ - type: map_at_1
568
+ value: 26.683
569
+ - type: map_at_10
570
+ value: 35.068
571
+ - type: map_at_100
572
+ value: 36.311
573
+ - type: map_at_1000
574
+ value: 36.436
575
+ - type: map_at_3
576
+ value: 32.371
577
+ - type: map_at_5
578
+ value: 33.761
579
+ - type: mrr_at_1
580
+ value: 32.435
581
+ - type: mrr_at_10
582
+ value: 40.721000000000004
583
+ - type: mrr_at_100
584
+ value: 41.535
585
+ - type: mrr_at_1000
586
+ value: 41.593
587
+ - type: mrr_at_3
588
+ value: 38.401999999999994
589
+ - type: mrr_at_5
590
+ value: 39.567
591
+ - type: ndcg_at_1
592
+ value: 32.435
593
+ - type: ndcg_at_10
594
+ value: 40.538000000000004
595
+ - type: ndcg_at_100
596
+ value: 45.963
597
+ - type: ndcg_at_1000
598
+ value: 48.400999999999996
599
+ - type: ndcg_at_3
600
+ value: 36.048
601
+ - type: ndcg_at_5
602
+ value: 37.899
603
+ - type: precision_at_1
604
+ value: 32.435
605
+ - type: precision_at_10
606
+ value: 7.1129999999999995
607
+ - type: precision_at_100
608
+ value: 1.162
609
+ - type: precision_at_1000
610
+ value: 0.156
611
+ - type: precision_at_3
612
+ value: 16.683
613
+ - type: precision_at_5
614
+ value: 11.684
615
+ - type: recall_at_1
616
+ value: 26.683
617
+ - type: recall_at_10
618
+ value: 51.517
619
+ - type: recall_at_100
620
+ value: 74.553
621
+ - type: recall_at_1000
622
+ value: 90.649
623
+ - type: recall_at_3
624
+ value: 38.495000000000005
625
+ - type: recall_at_5
626
+ value: 43.495
627
+ - task:
628
+ type: Retrieval
629
+ dataset:
630
+ type: BeIR/cqadupstack
631
+ name: MTEB CQADupstackProgrammersRetrieval
632
+ config: default
633
+ split: test
634
+ revision: None
635
+ metrics:
636
+ - type: map_at_1
637
+ value: 24.186
638
+ - type: map_at_10
639
+ value: 31.972
640
+ - type: map_at_100
641
+ value: 33.117000000000004
642
+ - type: map_at_1000
643
+ value: 33.243
644
+ - type: map_at_3
645
+ value: 29.423
646
+ - type: map_at_5
647
+ value: 30.847
648
+ - type: mrr_at_1
649
+ value: 29.794999999999998
650
+ - type: mrr_at_10
651
+ value: 36.767
652
+ - type: mrr_at_100
653
+ value: 37.645
654
+ - type: mrr_at_1000
655
+ value: 37.716
656
+ - type: mrr_at_3
657
+ value: 34.513
658
+ - type: mrr_at_5
659
+ value: 35.791000000000004
660
+ - type: ndcg_at_1
661
+ value: 29.794999999999998
662
+ - type: ndcg_at_10
663
+ value: 36.786
664
+ - type: ndcg_at_100
665
+ value: 41.94
666
+ - type: ndcg_at_1000
667
+ value: 44.830999999999996
668
+ - type: ndcg_at_3
669
+ value: 32.504
670
+ - type: ndcg_at_5
671
+ value: 34.404
672
+ - type: precision_at_1
673
+ value: 29.794999999999998
674
+ - type: precision_at_10
675
+ value: 6.518
676
+ - type: precision_at_100
677
+ value: 1.0659999999999998
678
+ - type: precision_at_1000
679
+ value: 0.149
680
+ - type: precision_at_3
681
+ value: 15.296999999999999
682
+ - type: precision_at_5
683
+ value: 10.731
684
+ - type: recall_at_1
685
+ value: 24.186
686
+ - type: recall_at_10
687
+ value: 46.617
688
+ - type: recall_at_100
689
+ value: 68.75
690
+ - type: recall_at_1000
691
+ value: 88.864
692
+ - type: recall_at_3
693
+ value: 34.199
694
+ - type: recall_at_5
695
+ value: 39.462
696
+ - task:
697
+ type: Retrieval
698
+ dataset:
699
+ type: BeIR/cqadupstack
700
+ name: MTEB CQADupstackRetrieval
701
+ config: default
702
+ split: test
703
+ revision: None
704
+ metrics:
705
+ - type: map_at_1
706
+ value: 24.22083333333333
707
+ - type: map_at_10
708
+ value: 31.606666666666662
709
+ - type: map_at_100
710
+ value: 32.6195
711
+ - type: map_at_1000
712
+ value: 32.739999999999995
713
+ - type: map_at_3
714
+ value: 29.37825
715
+ - type: map_at_5
716
+ value: 30.596083333333336
717
+ - type: mrr_at_1
718
+ value: 28.607916666666668
719
+ - type: mrr_at_10
720
+ value: 35.54591666666666
721
+ - type: mrr_at_100
722
+ value: 36.33683333333333
723
+ - type: mrr_at_1000
724
+ value: 36.40624999999999
725
+ - type: mrr_at_3
726
+ value: 33.526250000000005
727
+ - type: mrr_at_5
728
+ value: 34.6605
729
+ - type: ndcg_at_1
730
+ value: 28.607916666666668
731
+ - type: ndcg_at_10
732
+ value: 36.07966666666667
733
+ - type: ndcg_at_100
734
+ value: 40.73308333333333
735
+ - type: ndcg_at_1000
736
+ value: 43.40666666666666
737
+ - type: ndcg_at_3
738
+ value: 32.23525
739
+ - type: ndcg_at_5
740
+ value: 33.97083333333333
741
+ - type: precision_at_1
742
+ value: 28.607916666666668
743
+ - type: precision_at_10
744
+ value: 6.120333333333335
745
+ - type: precision_at_100
746
+ value: 0.9921666666666668
747
+ - type: precision_at_1000
748
+ value: 0.14091666666666666
749
+ - type: precision_at_3
750
+ value: 14.54975
751
+ - type: precision_at_5
752
+ value: 10.153166666666667
753
+ - type: recall_at_1
754
+ value: 24.22083333333333
755
+ - type: recall_at_10
756
+ value: 45.49183333333334
757
+ - type: recall_at_100
758
+ value: 66.28133333333332
759
+ - type: recall_at_1000
760
+ value: 85.16541666666667
761
+ - type: recall_at_3
762
+ value: 34.6485
763
+ - type: recall_at_5
764
+ value: 39.229749999999996
765
+ - task:
766
+ type: Retrieval
767
+ dataset:
768
+ type: BeIR/cqadupstack
769
+ name: MTEB CQADupstackStatsRetrieval
770
+ config: default
771
+ split: test
772
+ revision: None
773
+ metrics:
774
+ - type: map_at_1
775
+ value: 21.842
776
+ - type: map_at_10
777
+ value: 27.573999999999998
778
+ - type: map_at_100
779
+ value: 28.410999999999998
780
+ - type: map_at_1000
781
+ value: 28.502
782
+ - type: map_at_3
783
+ value: 25.921
784
+ - type: map_at_5
785
+ value: 26.888
786
+ - type: mrr_at_1
787
+ value: 24.08
788
+ - type: mrr_at_10
789
+ value: 29.915999999999997
790
+ - type: mrr_at_100
791
+ value: 30.669
792
+ - type: mrr_at_1000
793
+ value: 30.746000000000002
794
+ - type: mrr_at_3
795
+ value: 28.349000000000004
796
+ - type: mrr_at_5
797
+ value: 29.246
798
+ - type: ndcg_at_1
799
+ value: 24.08
800
+ - type: ndcg_at_10
801
+ value: 30.898999999999997
802
+ - type: ndcg_at_100
803
+ value: 35.272999999999996
804
+ - type: ndcg_at_1000
805
+ value: 37.679
806
+ - type: ndcg_at_3
807
+ value: 27.881
808
+ - type: ndcg_at_5
809
+ value: 29.432000000000002
810
+ - type: precision_at_1
811
+ value: 24.08
812
+ - type: precision_at_10
813
+ value: 4.678
814
+ - type: precision_at_100
815
+ value: 0.744
816
+ - type: precision_at_1000
817
+ value: 0.10300000000000001
818
+ - type: precision_at_3
819
+ value: 11.860999999999999
820
+ - type: precision_at_5
821
+ value: 8.16
822
+ - type: recall_at_1
823
+ value: 21.842
824
+ - type: recall_at_10
825
+ value: 38.66
826
+ - type: recall_at_100
827
+ value: 59.169000000000004
828
+ - type: recall_at_1000
829
+ value: 76.887
830
+ - type: recall_at_3
831
+ value: 30.532999999999998
832
+ - type: recall_at_5
833
+ value: 34.354
834
+ - task:
835
+ type: Retrieval
836
+ dataset:
837
+ type: BeIR/cqadupstack
838
+ name: MTEB CQADupstackTexRetrieval
839
+ config: default
840
+ split: test
841
+ revision: None
842
+ metrics:
843
+ - type: map_at_1
844
+ value: 17.145
845
+ - type: map_at_10
846
+ value: 22.729
847
+ - type: map_at_100
848
+ value: 23.574
849
+ - type: map_at_1000
850
+ value: 23.695
851
+ - type: map_at_3
852
+ value: 21.044
853
+ - type: map_at_5
854
+ value: 21.981
855
+ - type: mrr_at_1
856
+ value: 20.888
857
+ - type: mrr_at_10
858
+ value: 26.529000000000003
859
+ - type: mrr_at_100
860
+ value: 27.308
861
+ - type: mrr_at_1000
862
+ value: 27.389000000000003
863
+ - type: mrr_at_3
864
+ value: 24.868000000000002
865
+ - type: mrr_at_5
866
+ value: 25.825
867
+ - type: ndcg_at_1
868
+ value: 20.888
869
+ - type: ndcg_at_10
870
+ value: 26.457000000000004
871
+ - type: ndcg_at_100
872
+ value: 30.764000000000003
873
+ - type: ndcg_at_1000
874
+ value: 33.825
875
+ - type: ndcg_at_3
876
+ value: 23.483999999999998
877
+ - type: ndcg_at_5
878
+ value: 24.836
879
+ - type: precision_at_1
880
+ value: 20.888
881
+ - type: precision_at_10
882
+ value: 4.58
883
+ - type: precision_at_100
884
+ value: 0.784
885
+ - type: precision_at_1000
886
+ value: 0.121
887
+ - type: precision_at_3
888
+ value: 10.874
889
+ - type: precision_at_5
890
+ value: 7.639
891
+ - type: recall_at_1
892
+ value: 17.145
893
+ - type: recall_at_10
894
+ value: 33.938
895
+ - type: recall_at_100
896
+ value: 53.672
897
+ - type: recall_at_1000
898
+ value: 76.023
899
+ - type: recall_at_3
900
+ value: 25.363000000000003
901
+ - type: recall_at_5
902
+ value: 29.023
903
+ - task:
904
+ type: Retrieval
905
+ dataset:
906
+ type: BeIR/cqadupstack
907
+ name: MTEB CQADupstackUnixRetrieval
908
+ config: default
909
+ split: test
910
+ revision: None
911
+ metrics:
912
+ - type: map_at_1
913
+ value: 24.275
914
+ - type: map_at_10
915
+ value: 30.438
916
+ - type: map_at_100
917
+ value: 31.489
918
+ - type: map_at_1000
919
+ value: 31.601000000000003
920
+ - type: map_at_3
921
+ value: 28.647
922
+ - type: map_at_5
923
+ value: 29.660999999999998
924
+ - type: mrr_at_1
925
+ value: 28.077999999999996
926
+ - type: mrr_at_10
927
+ value: 34.098
928
+ - type: mrr_at_100
929
+ value: 35.025
930
+ - type: mrr_at_1000
931
+ value: 35.109
932
+ - type: mrr_at_3
933
+ value: 32.4
934
+ - type: mrr_at_5
935
+ value: 33.379999999999995
936
+ - type: ndcg_at_1
937
+ value: 28.077999999999996
938
+ - type: ndcg_at_10
939
+ value: 34.271
940
+ - type: ndcg_at_100
941
+ value: 39.352
942
+ - type: ndcg_at_1000
943
+ value: 42.199
944
+ - type: ndcg_at_3
945
+ value: 30.978
946
+ - type: ndcg_at_5
947
+ value: 32.498
948
+ - type: precision_at_1
949
+ value: 28.077999999999996
950
+ - type: precision_at_10
951
+ value: 5.345
952
+ - type: precision_at_100
953
+ value: 0.897
954
+ - type: precision_at_1000
955
+ value: 0.125
956
+ - type: precision_at_3
957
+ value: 13.526
958
+ - type: precision_at_5
959
+ value: 9.16
960
+ - type: recall_at_1
961
+ value: 24.275
962
+ - type: recall_at_10
963
+ value: 42.362
964
+ - type: recall_at_100
965
+ value: 64.461
+ - type: recall_at_1000
+ value: 84.981
+ - type: recall_at_3
+ value: 33.249
+ - type: recall_at_5
+ value: 37.214999999999996
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackWebmastersRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 22.358
+ - type: map_at_10
+ value: 30.062
+ - type: map_at_100
+ value: 31.189
+ - type: map_at_1000
+ value: 31.386999999999997
+ - type: map_at_3
+ value: 27.672
+ - type: map_at_5
+ value: 28.76
+ - type: mrr_at_1
+ value: 26.877000000000002
+ - type: mrr_at_10
+ value: 33.948
+ - type: mrr_at_100
+ value: 34.746
+ - type: mrr_at_1000
+ value: 34.816
+ - type: mrr_at_3
+ value: 31.884
+ - type: mrr_at_5
+ value: 33.001000000000005
+ - type: ndcg_at_1
+ value: 26.877000000000002
+ - type: ndcg_at_10
+ value: 34.977000000000004
+ - type: ndcg_at_100
+ value: 39.753
+ - type: ndcg_at_1000
+ value: 42.866
+ - type: ndcg_at_3
+ value: 30.956
+ - type: ndcg_at_5
+ value: 32.381
+ - type: precision_at_1
+ value: 26.877000000000002
+ - type: precision_at_10
+ value: 6.7
+ - type: precision_at_100
+ value: 1.287
+ - type: precision_at_1000
+ value: 0.215
+ - type: precision_at_3
+ value: 14.360999999999999
+ - type: precision_at_5
+ value: 10.119
+ - type: recall_at_1
+ value: 22.358
+ - type: recall_at_10
+ value: 44.183
+ - type: recall_at_100
+ value: 67.14
+ - type: recall_at_1000
+ value: 87.53999999999999
+ - type: recall_at_3
+ value: 32.79
+ - type: recall_at_5
+ value: 36.829
+ - task:
+ type: Retrieval
+ dataset:
+ type: BeIR/cqadupstack
+ name: MTEB CQADupstackWordpressRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 19.198999999999998
+ - type: map_at_10
+ value: 25.229000000000003
+ - type: map_at_100
+ value: 26.003
+ - type: map_at_1000
+ value: 26.111
+ - type: map_at_3
+ value: 23.442
+ - type: map_at_5
+ value: 24.343
+ - type: mrr_at_1
+ value: 21.072
+ - type: mrr_at_10
+ value: 27.02
+ - type: mrr_at_100
+ value: 27.735
+ - type: mrr_at_1000
+ value: 27.815
+ - type: mrr_at_3
+ value: 25.416
+ - type: mrr_at_5
+ value: 26.173999999999996
+ - type: ndcg_at_1
+ value: 21.072
+ - type: ndcg_at_10
+ value: 28.862
+ - type: ndcg_at_100
+ value: 33.043
+ - type: ndcg_at_1000
+ value: 36.003
+ - type: ndcg_at_3
+ value: 25.35
+ - type: ndcg_at_5
+ value: 26.773000000000003
+ - type: precision_at_1
+ value: 21.072
+ - type: precision_at_10
+ value: 4.436
+ - type: precision_at_100
+ value: 0.713
+ - type: precision_at_1000
+ value: 0.106
+ - type: precision_at_3
+ value: 10.659
+ - type: precision_at_5
+ value: 7.32
+ - type: recall_at_1
+ value: 19.198999999999998
+ - type: recall_at_10
+ value: 38.376
+ - type: recall_at_100
+ value: 58.36900000000001
+ - type: recall_at_1000
+ value: 80.92099999999999
+ - type: recall_at_3
+ value: 28.715000000000003
+ - type: recall_at_5
+ value: 32.147
+ - task:
+ type: Retrieval
+ dataset:
+ type: climate-fever
+ name: MTEB ClimateFEVER
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 5.9319999999999995
+ - type: map_at_10
+ value: 10.483
+ - type: map_at_100
+ value: 11.97
+ - type: map_at_1000
+ value: 12.171999999999999
+ - type: map_at_3
+ value: 8.477
+ - type: map_at_5
+ value: 9.495000000000001
+ - type: mrr_at_1
+ value: 13.094
+ - type: mrr_at_10
+ value: 21.282
+ - type: mrr_at_100
+ value: 22.556
+ - type: mrr_at_1000
+ value: 22.628999999999998
+ - type: mrr_at_3
+ value: 18.218999999999998
+ - type: mrr_at_5
+ value: 19.900000000000002
+ - type: ndcg_at_1
+ value: 13.094
+ - type: ndcg_at_10
+ value: 15.811
+ - type: ndcg_at_100
+ value: 23.035
+ - type: ndcg_at_1000
+ value: 27.089999999999996
+ - type: ndcg_at_3
+ value: 11.905000000000001
+ - type: ndcg_at_5
+ value: 13.377
+ - type: precision_at_1
+ value: 13.094
+ - type: precision_at_10
+ value: 5.225
+ - type: precision_at_100
+ value: 1.2970000000000002
+ - type: precision_at_1000
+ value: 0.203
+ - type: precision_at_3
+ value: 8.86
+ - type: precision_at_5
+ value: 7.309
+ - type: recall_at_1
+ value: 5.9319999999999995
+ - type: recall_at_10
+ value: 20.305
+ - type: recall_at_100
+ value: 46.314
+ - type: recall_at_1000
+ value: 69.612
+ - type: recall_at_3
+ value: 11.21
+ - type: recall_at_5
+ value: 14.773
+ - task:
+ type: Retrieval
+ dataset:
+ type: dbpedia-entity
+ name: MTEB DBPedia
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 8.674
+ - type: map_at_10
+ value: 17.822
+ - type: map_at_100
+ value: 24.794
+ - type: map_at_1000
+ value: 26.214
+ - type: map_at_3
+ value: 12.690999999999999
+ - type: map_at_5
+ value: 15.033
+ - type: mrr_at_1
+ value: 61.75000000000001
+ - type: mrr_at_10
+ value: 71.58
+ - type: mrr_at_100
+ value: 71.923
+ - type: mrr_at_1000
+ value: 71.932
+ - type: mrr_at_3
+ value: 70.125
+ - type: mrr_at_5
+ value: 71.038
+ - type: ndcg_at_1
+ value: 51
+ - type: ndcg_at_10
+ value: 38.637
+ - type: ndcg_at_100
+ value: 42.398
+ - type: ndcg_at_1000
+ value: 48.962
+ - type: ndcg_at_3
+ value: 43.29
+ - type: ndcg_at_5
+ value: 40.763
+ - type: precision_at_1
+ value: 61.75000000000001
+ - type: precision_at_10
+ value: 30.125
+ - type: precision_at_100
+ value: 9.53
+ - type: precision_at_1000
+ value: 1.9619999999999997
+ - type: precision_at_3
+ value: 45.583
+ - type: precision_at_5
+ value: 38.95
+ - type: recall_at_1
+ value: 8.674
+ - type: recall_at_10
+ value: 23.122
+ - type: recall_at_100
+ value: 47.46
+ - type: recall_at_1000
+ value: 67.662
+ - type: recall_at_3
+ value: 13.946
+ - type: recall_at_5
+ value: 17.768
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/emotion
+ name: MTEB EmotionClassification
+ config: default
+ split: test
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
+ metrics:
+ - type: accuracy
+ value: 46.86000000000001
+ - type: f1
+ value: 41.343580452760776
+ - task:
+ type: Retrieval
+ dataset:
+ type: fever
+ name: MTEB FEVER
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 36.609
+ - type: map_at_10
+ value: 47.552
+ - type: map_at_100
+ value: 48.283
+ - type: map_at_1000
+ value: 48.321
+ - type: map_at_3
+ value: 44.869
+ - type: map_at_5
+ value: 46.509
+ - type: mrr_at_1
+ value: 39.214
+ - type: mrr_at_10
+ value: 50.434999999999995
+ - type: mrr_at_100
+ value: 51.122
+ - type: mrr_at_1000
+ value: 51.151
+ - type: mrr_at_3
+ value: 47.735
+ - type: mrr_at_5
+ value: 49.394
+ - type: ndcg_at_1
+ value: 39.214
+ - type: ndcg_at_10
+ value: 53.52400000000001
+ - type: ndcg_at_100
+ value: 56.997
+ - type: ndcg_at_1000
+ value: 57.975
+ - type: ndcg_at_3
+ value: 48.173
+ - type: ndcg_at_5
+ value: 51.05800000000001
+ - type: precision_at_1
+ value: 39.214
+ - type: precision_at_10
+ value: 7.573
+ - type: precision_at_100
+ value: 0.9440000000000001
+ - type: precision_at_1000
+ value: 0.104
+ - type: precision_at_3
+ value: 19.782
+ - type: precision_at_5
+ value: 13.453000000000001
+ - type: recall_at_1
+ value: 36.609
+ - type: recall_at_10
+ value: 69.247
+ - type: recall_at_100
+ value: 84.99600000000001
+ - type: recall_at_1000
+ value: 92.40899999999999
+ - type: recall_at_3
+ value: 54.856
+ - type: recall_at_5
+ value: 61.797000000000004
+ - task:
+ type: Retrieval
+ dataset:
+ type: fiqa
+ name: MTEB FiQA2018
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 16.466
+ - type: map_at_10
+ value: 27.060000000000002
+ - type: map_at_100
+ value: 28.511999999999997
+ - type: map_at_1000
+ value: 28.693
+ - type: map_at_3
+ value: 22.777
+ - type: map_at_5
+ value: 25.086000000000002
+ - type: mrr_at_1
+ value: 32.716
+ - type: mrr_at_10
+ value: 41.593999999999994
+ - type: mrr_at_100
+ value: 42.370000000000005
+ - type: mrr_at_1000
+ value: 42.419000000000004
+ - type: mrr_at_3
+ value: 38.143
+ - type: mrr_at_5
+ value: 40.288000000000004
+ - type: ndcg_at_1
+ value: 32.716
+ - type: ndcg_at_10
+ value: 34.795
+ - type: ndcg_at_100
+ value: 40.58
+ - type: ndcg_at_1000
+ value: 43.993
+ - type: ndcg_at_3
+ value: 29.573
+ - type: ndcg_at_5
+ value: 31.583
+ - type: precision_at_1
+ value: 32.716
+ - type: precision_at_10
+ value: 9.937999999999999
+ - type: precision_at_100
+ value: 1.585
+ - type: precision_at_1000
+ value: 0.22
+ - type: precision_at_3
+ value: 19.496
+ - type: precision_at_5
+ value: 15.247
+ - type: recall_at_1
+ value: 16.466
+ - type: recall_at_10
+ value: 42.886
+ - type: recall_at_100
+ value: 64.724
+ - type: recall_at_1000
+ value: 85.347
+ - type: recall_at_3
+ value: 26.765
+ - type: recall_at_5
+ value: 33.603
+ - task:
+ type: Retrieval
+ dataset:
+ type: hotpotqa
+ name: MTEB HotpotQA
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 33.025
+ - type: map_at_10
+ value: 47.343
+ - type: map_at_100
+ value: 48.207
+ - type: map_at_1000
+ value: 48.281
+ - type: map_at_3
+ value: 44.519
+ - type: map_at_5
+ value: 46.217000000000006
+ - type: mrr_at_1
+ value: 66.05
+ - type: mrr_at_10
+ value: 72.94699999999999
+ - type: mrr_at_100
+ value: 73.289
+ - type: mrr_at_1000
+ value: 73.30499999999999
+ - type: mrr_at_3
+ value: 71.686
+ - type: mrr_at_5
+ value: 72.491
+ - type: ndcg_at_1
+ value: 66.05
+ - type: ndcg_at_10
+ value: 56.338
+ - type: ndcg_at_100
+ value: 59.599999999999994
+ - type: ndcg_at_1000
+ value: 61.138000000000005
+ - type: ndcg_at_3
+ value: 52.034000000000006
+ - type: ndcg_at_5
+ value: 54.352000000000004
+ - type: precision_at_1
+ value: 66.05
+ - type: precision_at_10
+ value: 11.693000000000001
+ - type: precision_at_100
+ value: 1.425
+ - type: precision_at_1000
+ value: 0.163
+ - type: precision_at_3
+ value: 32.613
+ - type: precision_at_5
+ value: 21.401999999999997
+ - type: recall_at_1
+ value: 33.025
+ - type: recall_at_10
+ value: 58.467
+ - type: recall_at_100
+ value: 71.242
+ - type: recall_at_1000
+ value: 81.452
+ - type: recall_at_3
+ value: 48.92
+ - type: recall_at_5
+ value: 53.504
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/imdb
+ name: MTEB ImdbClassification
+ config: default
+ split: test
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
+ metrics:
+ - type: accuracy
+ value: 75.5492
+ - type: ap
+ value: 69.42911637216271
+ - type: f1
+ value: 75.39113704261024
+ - task:
+ type: Retrieval
+ dataset:
+ type: msmarco
+ name: MTEB MSMARCO
+ config: default
+ split: dev
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 23.173
+ - type: map_at_10
+ value: 35.453
+ - type: map_at_100
+ value: 36.573
+ - type: map_at_1000
+ value: 36.620999999999995
+ - type: map_at_3
+ value: 31.655
+ - type: map_at_5
+ value: 33.823
+ - type: mrr_at_1
+ value: 23.868000000000002
+ - type: mrr_at_10
+ value: 36.085
+ - type: mrr_at_100
+ value: 37.15
+ - type: mrr_at_1000
+ value: 37.193
+ - type: mrr_at_3
+ value: 32.376
+ - type: mrr_at_5
+ value: 34.501
+ - type: ndcg_at_1
+ value: 23.854
+ - type: ndcg_at_10
+ value: 42.33
+ - type: ndcg_at_100
+ value: 47.705999999999996
+ - type: ndcg_at_1000
+ value: 48.91
+ - type: ndcg_at_3
+ value: 34.604
+ - type: ndcg_at_5
+ value: 38.473
+ - type: precision_at_1
+ value: 23.854
+ - type: precision_at_10
+ value: 6.639
+ - type: precision_at_100
+ value: 0.932
+ - type: precision_at_1000
+ value: 0.104
+ - type: precision_at_3
+ value: 14.685
+ - type: precision_at_5
+ value: 10.782
+ - type: recall_at_1
+ value: 23.173
+ - type: recall_at_10
+ value: 63.441
+ - type: recall_at_100
+ value: 88.25
+ - type: recall_at_1000
+ value: 97.438
+ - type: recall_at_3
+ value: 42.434
+ - type: recall_at_5
+ value: 51.745
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_domain
+ name: MTEB MTOPDomainClassification (en)
+ config: en
+ split: test
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
+ metrics:
+ - type: accuracy
+ value: 92.05426356589147
+ - type: f1
+ value: 91.88068588063942
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/mtop_intent
+ name: MTEB MTOPIntentClassification (en)
+ config: en
+ split: test
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
+ metrics:
+ - type: accuracy
+ value: 73.23985408116735
+ - type: f1
+ value: 55.858906745287506
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/amazon_massive_intent
+ name: MTEB MassiveIntentClassification (en)
+ config: en
+ split: test
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
+ metrics:
+ - type: accuracy
+ value: 72.21923335574984
+ - type: f1
+ value: 70.0174116204253
+ - task:
+ type: Classification
+ dataset:
+ type: mteb/amazon_massive_scenario
+ name: MTEB MassiveScenarioClassification (en)
+ config: en
+ split: test
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
+ metrics:
+ - type: accuracy
+ value: 75.77673167451245
+ - type: f1
+ value: 75.44811354778666
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/medrxiv-clustering-p2p
+ name: MTEB MedrxivClusteringP2P
+ config: default
+ split: test
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
+ metrics:
+ - type: v_measure
+ value: 31.340414710728737
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/medrxiv-clustering-s2s
+ name: MTEB MedrxivClusteringS2S
+ config: default
+ split: test
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
+ metrics:
+ - type: v_measure
+ value: 28.196676760061578
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/mind_small
+ name: MTEB MindSmallReranking
+ config: default
+ split: test
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
+ metrics:
+ - type: map
+ value: 29.564149683482206
+ - type: mrr
+ value: 30.28995474250486
+ - task:
+ type: Retrieval
+ dataset:
+ type: nfcorpus
+ name: MTEB NFCorpus
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 5.93
+ - type: map_at_10
+ value: 12.828000000000001
+ - type: map_at_100
+ value: 15.501000000000001
+ - type: map_at_1000
+ value: 16.791
+ - type: map_at_3
+ value: 9.727
+ - type: map_at_5
+ value: 11.318999999999999
+ - type: mrr_at_1
+ value: 47.678
+ - type: mrr_at_10
+ value: 55.893
+ - type: mrr_at_100
+ value: 56.491
+ - type: mrr_at_1000
+ value: 56.53
+ - type: mrr_at_3
+ value: 54.386
+ - type: mrr_at_5
+ value: 55.516
+ - type: ndcg_at_1
+ value: 45.975
+ - type: ndcg_at_10
+ value: 33.928999999999995
+ - type: ndcg_at_100
+ value: 30.164
+ - type: ndcg_at_1000
+ value: 38.756
+ - type: ndcg_at_3
+ value: 41.077000000000005
+ - type: ndcg_at_5
+ value: 38.415
+ - type: precision_at_1
+ value: 47.678
+ - type: precision_at_10
+ value: 24.365000000000002
+ - type: precision_at_100
+ value: 7.344
+ - type: precision_at_1000
+ value: 1.994
+ - type: precision_at_3
+ value: 38.184000000000005
+ - type: precision_at_5
+ value: 33.003
+ - type: recall_at_1
+ value: 5.93
+ - type: recall_at_10
+ value: 16.239
+ - type: recall_at_100
+ value: 28.782999999999998
+ - type: recall_at_1000
+ value: 60.11
+ - type: recall_at_3
+ value: 10.700999999999999
+ - type: recall_at_5
+ value: 13.584
+ - task:
+ type: Retrieval
+ dataset:
+ type: nq
+ name: MTEB NQ
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 36.163000000000004
+ - type: map_at_10
+ value: 51.520999999999994
+ - type: map_at_100
+ value: 52.449
+ - type: map_at_1000
+ value: 52.473000000000006
+ - type: map_at_3
+ value: 47.666
+ - type: map_at_5
+ value: 50.043000000000006
+ - type: mrr_at_1
+ value: 40.266999999999996
+ - type: mrr_at_10
+ value: 54.074
+ - type: mrr_at_100
+ value: 54.722
+ - type: mrr_at_1000
+ value: 54.739000000000004
+ - type: mrr_at_3
+ value: 51.043000000000006
+ - type: mrr_at_5
+ value: 52.956
+ - type: ndcg_at_1
+ value: 40.238
+ - type: ndcg_at_10
+ value: 58.73199999999999
+ - type: ndcg_at_100
+ value: 62.470000000000006
+ - type: ndcg_at_1000
+ value: 63.083999999999996
+ - type: ndcg_at_3
+ value: 51.672
+ - type: ndcg_at_5
+ value: 55.564
+ - type: precision_at_1
+ value: 40.238
+ - type: precision_at_10
+ value: 9.279
+ - type: precision_at_100
+ value: 1.139
+ - type: precision_at_1000
+ value: 0.12
+ - type: precision_at_3
+ value: 23.078000000000003
+ - type: precision_at_5
+ value: 16.176
+ - type: recall_at_1
+ value: 36.163000000000004
+ - type: recall_at_10
+ value: 77.88199999999999
+ - type: recall_at_100
+ value: 93.83399999999999
+ - type: recall_at_1000
+ value: 98.465
+ - type: recall_at_3
+ value: 59.857000000000006
+ - type: recall_at_5
+ value: 68.73599999999999
+ - task:
+ type: Retrieval
+ dataset:
+ type: quora
+ name: MTEB QuoraRetrieval
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 70.344
+ - type: map_at_10
+ value: 83.907
+ - type: map_at_100
+ value: 84.536
+ - type: map_at_1000
+ value: 84.557
+ - type: map_at_3
+ value: 80.984
+ - type: map_at_5
+ value: 82.844
+ - type: mrr_at_1
+ value: 81.02000000000001
+ - type: mrr_at_10
+ value: 87.158
+ - type: mrr_at_100
+ value: 87.268
+ - type: mrr_at_1000
+ value: 87.26899999999999
+ - type: mrr_at_3
+ value: 86.17
+ - type: mrr_at_5
+ value: 86.87
+ - type: ndcg_at_1
+ value: 81.02000000000001
+ - type: ndcg_at_10
+ value: 87.70700000000001
+ - type: ndcg_at_100
+ value: 89.004
+ - type: ndcg_at_1000
+ value: 89.139
+ - type: ndcg_at_3
+ value: 84.841
+ - type: ndcg_at_5
+ value: 86.455
+ - type: precision_at_1
+ value: 81.02000000000001
+ - type: precision_at_10
+ value: 13.248999999999999
+ - type: precision_at_100
+ value: 1.516
+ - type: precision_at_1000
+ value: 0.156
+ - type: precision_at_3
+ value: 36.963
+ - type: precision_at_5
+ value: 24.33
+ - type: recall_at_1
+ value: 70.344
+ - type: recall_at_10
+ value: 94.75099999999999
+ - type: recall_at_100
+ value: 99.30499999999999
+ - type: recall_at_1000
+ value: 99.928
+ - type: recall_at_3
+ value: 86.506
+ - type: recall_at_5
+ value: 91.083
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/reddit-clustering
+ name: MTEB RedditClustering
+ config: default
+ split: test
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
+ metrics:
+ - type: v_measure
+ value: 42.873718018378305
+ - task:
+ type: Clustering
+ dataset:
+ type: mteb/reddit-clustering-p2p
+ name: MTEB RedditClusteringP2P
+ config: default
+ split: test
+ revision: 282350215ef01743dc01b456c7f5241fa8937f16
+ metrics:
+ - type: v_measure
+ value: 56.39477366450528
+ - task:
+ type: Retrieval
+ dataset:
+ type: scidocs
+ name: MTEB SCIDOCS
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 3.868
+ - type: map_at_10
+ value: 9.611
+ - type: map_at_100
+ value: 11.087
+ - type: map_at_1000
+ value: 11.332
+ - type: map_at_3
+ value: 6.813
+ - type: map_at_5
+ value: 8.233
+ - type: mrr_at_1
+ value: 19
+ - type: mrr_at_10
+ value: 28.457
+ - type: mrr_at_100
+ value: 29.613
+ - type: mrr_at_1000
+ value: 29.695
+ - type: mrr_at_3
+ value: 25.55
+ - type: mrr_at_5
+ value: 27.29
+ - type: ndcg_at_1
+ value: 19
+ - type: ndcg_at_10
+ value: 16.419
+ - type: ndcg_at_100
+ value: 22.817999999999998
+ - type: ndcg_at_1000
+ value: 27.72
+ - type: ndcg_at_3
+ value: 15.379000000000001
+ - type: ndcg_at_5
+ value: 13.645
+ - type: precision_at_1
+ value: 19
+ - type: precision_at_10
+ value: 8.540000000000001
+ - type: precision_at_100
+ value: 1.7819999999999998
+ - type: precision_at_1000
+ value: 0.297
+ - type: precision_at_3
+ value: 14.267
+ - type: precision_at_5
+ value: 12.04
+ - type: recall_at_1
+ value: 3.868
+ - type: recall_at_10
+ value: 17.288
+ - type: recall_at_100
+ value: 36.144999999999996
+ - type: recall_at_1000
+ value: 60.199999999999996
+ - type: recall_at_3
+ value: 8.688
+ - type: recall_at_5
+ value: 12.198
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sickr-sts
+ name: MTEB SICK-R
+ config: default
+ split: test
+ revision: a6ea5a8cab320b040a23452cc28066d9beae2cee
+ metrics:
+ - type: cos_sim_pearson
+ value: 83.96614722598582
+ - type: cos_sim_spearman
+ value: 78.9003023008781
+ - type: euclidean_pearson
+ value: 81.01829384436505
+ - type: euclidean_spearman
+ value: 78.93248416788914
+ - type: manhattan_pearson
+ value: 81.1665428926402
+ - type: manhattan_spearman
+ value: 78.93264116287453
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts12-sts
+ name: MTEB STS12
+ config: default
+ split: test
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
+ metrics:
+ - type: cos_sim_pearson
+ value: 83.54613363895993
+ - type: cos_sim_spearman
+ value: 75.1883451602451
+ - type: euclidean_pearson
+ value: 79.70320886899894
+ - type: euclidean_spearman
+ value: 74.5917140136796
+ - type: manhattan_pearson
+ value: 79.82157067185999
+ - type: manhattan_spearman
+ value: 74.74185720594735
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts13-sts
+ name: MTEB STS13
+ config: default
+ split: test
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
+ metrics:
+ - type: cos_sim_pearson
+ value: 81.30430156721782
+ - type: cos_sim_spearman
+ value: 81.79962989974364
+ - type: euclidean_pearson
+ value: 80.89058823224924
+ - type: euclidean_spearman
+ value: 81.35929372984597
+ - type: manhattan_pearson
+ value: 81.12204370487478
+ - type: manhattan_spearman
+ value: 81.6248963282232
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts14-sts
+ name: MTEB STS14
+ config: default
+ split: test
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
+ metrics:
+ - type: cos_sim_pearson
+ value: 81.13064504403134
+ - type: cos_sim_spearman
+ value: 78.48371403924872
+ - type: euclidean_pearson
+ value: 80.16794919665591
+ - type: euclidean_spearman
+ value: 78.29216082221699
+ - type: manhattan_pearson
+ value: 80.22308565207301
+ - type: manhattan_spearman
+ value: 78.37829229948022
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts15-sts
+ name: MTEB STS15
+ config: default
+ split: test
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
+ metrics:
+ - type: cos_sim_pearson
+ value: 86.52918899541099
+ - type: cos_sim_spearman
+ value: 87.49276894673142
+ - type: euclidean_pearson
+ value: 86.77440570164254
+ - type: euclidean_spearman
+ value: 87.5753295736756
+ - type: manhattan_pearson
+ value: 86.86098573892133
+ - type: manhattan_spearman
+ value: 87.65848591821947
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts16-sts
+ name: MTEB STS16
+ config: default
+ split: test
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
+ metrics:
+ - type: cos_sim_pearson
+ value: 82.86805307244882
+ - type: cos_sim_spearman
+ value: 84.58066253757511
+ - type: euclidean_pearson
+ value: 84.38377000876991
+ - type: euclidean_spearman
+ value: 85.1837278784528
+ - type: manhattan_pearson
+ value: 84.41903291363842
+ - type: manhattan_spearman
+ value: 85.19023736251052
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts17-crosslingual-sts
+ name: MTEB STS17 (en-en)
+ config: en-en
+ split: test
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
+ metrics:
+ - type: cos_sim_pearson
+ value: 86.77218560282436
+ - type: cos_sim_spearman
+ value: 87.94243515296604
+ - type: euclidean_pearson
+ value: 88.22800939214864
+ - type: euclidean_spearman
+ value: 87.91106839439841
+ - type: manhattan_pearson
+ value: 88.17063269848741
+ - type: manhattan_spearman
+ value: 87.72751904126062
+ - task:
+ type: STS
+ dataset:
+ type: mteb/sts22-crosslingual-sts
+ name: MTEB STS22 (en)
+ config: en
+ split: test
+ revision: 6d1ba47164174a496b7fa5d3569dae26a6813b80
+ metrics:
+ - type: cos_sim_pearson
+ value: 60.40731554300387
+ - type: cos_sim_spearman
+ value: 63.76300532966479
+ - type: euclidean_pearson
+ value: 62.94727878229085
+ - type: euclidean_spearman
+ value: 63.678039531461216
+ - type: manhattan_pearson
+ value: 63.00661039863549
+ - type: manhattan_spearman
+ value: 63.6282591984376
+ - task:
+ type: STS
+ dataset:
+ type: mteb/stsbenchmark-sts
+ name: MTEB STSBenchmark
+ config: default
+ split: test
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
+ metrics:
+ - type: cos_sim_pearson
+ value: 84.92731569745344
+ - type: cos_sim_spearman
+ value: 86.36336704300167
+ - type: euclidean_pearson
+ value: 86.09122224841195
+ - type: euclidean_spearman
+ value: 86.2116149319238
+ - type: manhattan_pearson
+ value: 86.07879456717032
+ - type: manhattan_spearman
+ value: 86.2022069635119
+ - task:
+ type: Reranking
+ dataset:
+ type: mteb/scidocs-reranking
+ name: MTEB SciDocsRR
+ config: default
+ split: test
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
+ metrics:
+ - type: map
+ value: 79.75976311752326
+ - type: mrr
+ value: 94.15782837351466
+ - task:
+ type: Retrieval
+ dataset:
+ type: scifact
+ name: MTEB SciFact
+ config: default
+ split: test
+ revision: None
+ metrics:
+ - type: map_at_1
+ value: 51.193999999999996
+ - type: map_at_10
+ value: 61.224999999999994
+ - type: map_at_100
+ value: 62.031000000000006
+ - type: map_at_1000
+ value: 62.066
+ - type: map_at_3
+ value: 59.269000000000005
+ - type: map_at_5
+ value: 60.159
+ - type: mrr_at_1
+ value: 53.667
+ - type: mrr_at_10
+ value: 62.74999999999999
+ - type: mrr_at_100
+ value: 63.39399999999999
+ - type: mrr_at_1000
+ value: 63.425
+ - type: mrr_at_3
+ value: 61.389
+ - type: mrr_at_5
+ value: 61.989000000000004
+ - type: ndcg_at_1
+ value: 53.667
+ - type: ndcg_at_10
+ value: 65.596
+ - type: ndcg_at_100
+ value: 68.906
+ - type: ndcg_at_1000
+ value: 69.78999999999999
+ - type: ndcg_at_3
+ value: 62.261
+ - type: ndcg_at_5
+ value: 63.453
+ - type: precision_at_1
+ value: 53.667
+ - type: precision_at_10
+ value: 8.667
+ - type: precision_at_100
+ value: 1.04
+ - type: precision_at_1000
+ value: 0.11100000000000002
+ - type: precision_at_3
+ value: 24.556
+ - type: precision_at_5
+ value: 15.6
+ - type: recall_at_1
+ value: 51.193999999999996
+ - type: recall_at_10
+ value: 77.156
+ - type: recall_at_100
+ value: 91.43299999999999
+ - type: recall_at_1000
+ value: 98.333
+ - type: recall_at_3
+ value: 67.994
+ - type: recall_at_5
+ value: 71.14399999999999
+ - task:
2209
+ type: PairClassification
2210
+ dataset:
2211
+ type: mteb/sprintduplicatequestions-pairclassification
2212
+ name: MTEB SprintDuplicateQuestions
2213
+ config: default
2214
+ split: test
2215
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
2216
+ metrics:
2217
+ - type: cos_sim_accuracy
2218
+ value: 99.81485148514851
2219
+ - type: cos_sim_ap
2220
+ value: 95.28896513388551
2221
+ - type: cos_sim_f1
2222
+ value: 90.43478260869566
2223
+ - type: cos_sim_precision
2224
+ value: 92.56544502617801
2225
+ - type: cos_sim_recall
2226
+ value: 88.4
2227
+ - type: dot_accuracy
2228
+ value: 99.30594059405941
2229
+ - type: dot_ap
2230
+ value: 61.6432597455472
2231
+ - type: dot_f1
2232
+ value: 59.46481665014866
2233
+ - type: dot_precision
2234
+ value: 58.93909626719057
2235
+ - type: dot_recall
2236
+ value: 60
2237
+ - type: euclidean_accuracy
2238
+ value: 99.81980198019802
2239
+ - type: euclidean_ap
2240
+ value: 95.21411049527
2241
+ - type: euclidean_f1
2242
+ value: 91.06090373280944
2243
+ - type: euclidean_precision
2244
+ value: 89.47876447876449
2245
+ - type: euclidean_recall
2246
+ value: 92.7
2247
+ - type: manhattan_accuracy
2248
+ value: 99.81782178217821
2249
+ - type: manhattan_ap
2250
+ value: 95.32449994414968
2251
+ - type: manhattan_f1
2252
+ value: 90.86395233366436
2253
+ - type: manhattan_precision
2254
+ value: 90.23668639053254
2255
+ - type: manhattan_recall
2256
+ value: 91.5
2257
+ - type: max_accuracy
2258
+ value: 99.81980198019802
2259
+ - type: max_ap
2260
+ value: 95.32449994414968
2261
+ - type: max_f1
2262
+ value: 91.06090373280944
2263
- task:
    type: Clustering
  dataset:
    type: mteb/stackexchange-clustering
    name: MTEB StackExchangeClustering
    config: default
    split: test
    revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
  metrics:
  - type: v_measure
    value: 59.08045614613064
- task:
    type: Clustering
  dataset:
    type: mteb/stackexchange-clustering-p2p
    name: MTEB StackExchangeClusteringP2P
    config: default
    split: test
    revision: 815ca46b2622cec33ccafc3735d572c266efdb44
  metrics:
  - type: v_measure
    value: 30.297802606804748
- task:
    type: Reranking
  dataset:
    type: mteb/stackoverflowdupquestions-reranking
    name: MTEB StackOverflowDupQuestions
    config: default
    split: test
    revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
  metrics:
  - type: map
    value: 49.12801740706292
  - type: mrr
    value: 50.05592956879722
- task:
    type: Summarization
  dataset:
    type: mteb/summeval
    name: MTEB SummEval
    config: default
    split: test
    revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
  metrics:
  - type: cos_sim_pearson
    value: 31.523347880124497
  - type: cos_sim_spearman
    value: 31.388214436391014
  - type: dot_pearson
    value: 24.55403435439901
  - type: dot_spearman
    value: 23.50153210841191
- task:
    type: Retrieval
  dataset:
    type: trec-covid
    name: MTEB TRECCOVID
    config: default
    split: test
    revision: None
  metrics:
  - type: map_at_1
    value: 0.243
  - type: map_at_10
    value: 1.886
  - type: map_at_100
    value: 10.040000000000001
  - type: map_at_1000
    value: 23.768
  - type: map_at_3
    value: 0.674
  - type: map_at_5
    value: 1.079
  - type: mrr_at_1
    value: 88
  - type: mrr_at_10
    value: 93.667
  - type: mrr_at_100
    value: 93.667
  - type: mrr_at_1000
    value: 93.667
  - type: mrr_at_3
    value: 93.667
  - type: mrr_at_5
    value: 93.667
  - type: ndcg_at_1
    value: 83
  - type: ndcg_at_10
    value: 76.777
  - type: ndcg_at_100
    value: 55.153
  - type: ndcg_at_1000
    value: 47.912
  - type: ndcg_at_3
    value: 81.358
  - type: ndcg_at_5
    value: 80.74799999999999
  - type: precision_at_1
    value: 88
  - type: precision_at_10
    value: 80.80000000000001
  - type: precision_at_100
    value: 56.02
  - type: precision_at_1000
    value: 21.51
  - type: precision_at_3
    value: 86
  - type: precision_at_5
    value: 86
  - type: recall_at_1
    value: 0.243
  - type: recall_at_10
    value: 2.0869999999999997
  - type: recall_at_100
    value: 13.014000000000001
  - type: recall_at_1000
    value: 44.433
  - type: recall_at_3
    value: 0.6910000000000001
  - type: recall_at_5
    value: 1.1440000000000001
- task:
    type: Retrieval
  dataset:
    type: webis-touche2020
    name: MTEB Touche2020
    config: default
    split: test
    revision: None
  metrics:
  - type: map_at_1
    value: 3.066
  - type: map_at_10
    value: 10.615
  - type: map_at_100
    value: 16.463
  - type: map_at_1000
    value: 17.815
  - type: map_at_3
    value: 5.7860000000000005
  - type: map_at_5
    value: 7.353999999999999
  - type: mrr_at_1
    value: 38.775999999999996
  - type: mrr_at_10
    value: 53.846000000000004
  - type: mrr_at_100
    value: 54.37
  - type: mrr_at_1000
    value: 54.37
  - type: mrr_at_3
    value: 48.980000000000004
  - type: mrr_at_5
    value: 51.735
  - type: ndcg_at_1
    value: 34.694
  - type: ndcg_at_10
    value: 26.811
  - type: ndcg_at_100
    value: 37.342999999999996
  - type: ndcg_at_1000
    value: 47.964
  - type: ndcg_at_3
    value: 30.906
  - type: ndcg_at_5
    value: 27.77
  - type: precision_at_1
    value: 38.775999999999996
  - type: precision_at_10
    value: 23.878
  - type: precision_at_100
    value: 7.632999999999999
  - type: precision_at_1000
    value: 1.469
  - type: precision_at_3
    value: 31.973000000000003
  - type: precision_at_5
    value: 26.939
  - type: recall_at_1
    value: 3.066
  - type: recall_at_10
    value: 17.112
  - type: recall_at_100
    value: 47.723
  - type: recall_at_1000
    value: 79.50500000000001
  - type: recall_at_3
    value: 6.825
  - type: recall_at_5
    value: 9.584
- task:
    type: Classification
  dataset:
    type: mteb/toxic_conversations_50k
    name: MTEB ToxicConversationsClassification
    config: default
    split: test
    revision: d7c0de2777da35d6aae2200a62c6e0e5af397c4c
  metrics:
  - type: accuracy
    value: 72.76460000000002
  - type: ap
    value: 14.944240012137053
  - type: f1
    value: 55.89805777266571
- task:
    type: Classification
  dataset:
    type: mteb/tweet_sentiment_extraction
    name: MTEB TweetSentimentExtractionClassification
    config: default
    split: test
    revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
  metrics:
  - type: accuracy
    value: 63.30503678551217
  - type: f1
    value: 63.57492701921179
- task:
    type: Clustering
  dataset:
    type: mteb/twentynewsgroups-clustering
    name: MTEB TwentyNewsgroupsClustering
    config: default
    split: test
    revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
  metrics:
  - type: v_measure
    value: 37.51066495006874
- task:
    type: PairClassification
  dataset:
    type: mteb/twittersemeval2015-pairclassification
    name: MTEB TwitterSemEval2015
    config: default
    split: test
    revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
  metrics:
  - type: cos_sim_accuracy
    value: 86.07021517553794
  - type: cos_sim_ap
    value: 74.15520712370555
  - type: cos_sim_f1
    value: 68.64321608040201
  - type: cos_sim_precision
    value: 65.51558752997602
  - type: cos_sim_recall
    value: 72.0844327176781
  - type: dot_accuracy
    value: 80.23484532395541
  - type: dot_ap
    value: 54.298763810214176
  - type: dot_f1
    value: 53.22254659779924
  - type: dot_precision
    value: 46.32525410476936
  - type: dot_recall
    value: 62.532981530343015
  - type: euclidean_accuracy
    value: 86.04637301066937
  - type: euclidean_ap
    value: 73.85333854233123
  - type: euclidean_f1
    value: 68.77723660599845
  - type: euclidean_precision
    value: 66.87437686939182
  - type: euclidean_recall
    value: 70.79155672823218
  - type: manhattan_accuracy
    value: 85.98676759849795
  - type: manhattan_ap
    value: 73.56016090035973
  - type: manhattan_f1
    value: 68.48878539036647
  - type: manhattan_precision
    value: 63.9505607690547
  - type: manhattan_recall
    value: 73.7203166226913
  - type: max_accuracy
    value: 86.07021517553794
  - type: max_ap
    value: 74.15520712370555
  - type: max_f1
    value: 68.77723660599845
- task:
    type: PairClassification
  dataset:
    type: mteb/twitterurlcorpus-pairclassification
    name: MTEB TwitterURLCorpus
    config: default
    split: test
    revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
  metrics:
  - type: cos_sim_accuracy
    value: 88.92769821865176
  - type: cos_sim_ap
    value: 85.78879502899773
  - type: cos_sim_f1
    value: 78.14414083990464
  - type: cos_sim_precision
    value: 74.61651607480563
  - type: cos_sim_recall
    value: 82.0218663381583
  - type: dot_accuracy
    value: 84.95750378390964
  - type: dot_ap
    value: 75.80219641857563
  - type: dot_f1
    value: 70.13966179585681
  - type: dot_precision
    value: 65.71140262361251
  - type: dot_recall
    value: 75.20788420080073
  - type: euclidean_accuracy
    value: 88.93546008460433
  - type: euclidean_ap
    value: 85.72056428301667
  - type: euclidean_f1
    value: 78.14387902598124
  - type: euclidean_precision
    value: 75.3376688344172
  - type: euclidean_recall
    value: 81.16723129042192
  - type: manhattan_accuracy
    value: 88.96262661543835
  - type: manhattan_ap
    value: 85.76605136314335
  - type: manhattan_f1
    value: 78.26696165191743
  - type: manhattan_precision
    value: 75.0990659496179
  - type: manhattan_recall
    value: 81.71388974437943
  - type: max_accuracy
    value: 88.96262661543835
  - type: max_ap
    value: 85.78879502899773
  - type: max_f1
    value: 78.26696165191743
language:
- en
license: mit
---
# Fast-Inference with Ctranslate2
Speed up inference and reduce memory usage by 2x-4x using int8 inference in C++ on CPU or GPU.

Quantized version of [intfloat/e5-small](https://huggingface.co/intfloat/e5-small).
```bash
pip install "hf-hub-ctranslate2>=2.10.0" "ctranslate2>=3.16.0"
```

```python
from hf_hub_ctranslate2 import CT2SentenceTransformer

model_name = "michaelfeil/ct2fast-e5-small"
model = CT2SentenceTransformer(
    model_name, compute_type="int8_float16", device="cuda"
)
# Encode sentences into L2-normalized embeddings
embeddings = model.encode(
    ["I like soccer", "I like tennis", "The eiffel tower is in Paris"],
    batch_size=32,
    convert_to_numpy=True,
    normalize_embeddings=True,
)
print(embeddings.shape, embeddings)
# Pairwise cosine similarities, scaled to 0-100
scores = (embeddings @ embeddings.T) * 100
```

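Because `normalize_embeddings=True` L2-normalizes each row, the matrix product above yields cosine similarities. A minimal NumPy sketch with toy vectors (hypothetical values, not actual model output) illustrates the scoring step:

```python
import numpy as np

# Toy stand-ins for embedding rows; normalize them as model.encode would
emb = np.array([[0.6, 0.8], [0.8, 0.6], [1.0, 0.0]])
emb = emb / np.linalg.norm(emb, axis=1, keepdims=True)  # unit-length rows

# Dot products of unit vectors are cosine similarities; scale to 0-100
scores = (emb @ emb.T) * 100
print(scores)
```

Each vector scores ~100 against itself, and the more similar pair (first two rows) scores higher than the dissimilar one.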
Checkpoint compatible with [ctranslate2>=3.16.0](https://github.com/OpenNMT/CTranslate2)
and [hf-hub-ctranslate2>=2.10.0](https://github.com/michaelfeil/hf-hub-ctranslate2):
- `compute_type=int8_float16` for `device="cuda"`
- `compute_type=int8` for `device="cpu"`

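For CPU-only inference, the same call pattern applies with the `int8` compute type (a sketch reusing the API from the example above; not benchmarked here):

```python
from hf_hub_ctranslate2 import CT2SentenceTransformer

# int8 quantization on CPU; mirrors the CUDA example above
model = CT2SentenceTransformer(
    "michaelfeil/ct2fast-e5-small", compute_type="int8", device="cpu"
)
embeddings = model.encode(
    ["query: how big is paris"],
    convert_to_numpy=True,
    normalize_embeddings=True,
)
```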
Converted on 2023-06-18 using
```
ct2-transformers-converter --model intfloat/e5-small --output_dir ~/tmp-ct2fast-e5-small --force --copy_files tokenizer.json sentence_bert_config.json README.md modules.json special_tokens_map.json vocab.txt tokenizer_config.json .gitattributes --trust_remote_code
```

# License and other remarks:
This is just a quantized version. License conditions are intended to be identical to those of the original Hugging Face repo.

# Original description

# E5-small

[Text Embeddings by Weakly-Supervised Contrastive Pre-training](https://arxiv.org/pdf/2212.03533.pdf).
Liang Wang, Nan Yang, Xiaolong Huang, Binxing Jiao, Linjun Yang, Daxin Jiang, Rangan Majumder, Furu Wei, arXiv 2022

This model has 12 layers and an embedding size of 384.

## Usage

Below is an example of encoding queries and passages from the MS-MARCO passage ranking dataset.

```python
import torch.nn.functional as F

from torch import Tensor
from transformers import AutoTokenizer, AutoModel


def average_pool(last_hidden_states: Tensor,
                 attention_mask: Tensor) -> Tensor:
    # Zero out padded positions, then average over the real tokens only
    last_hidden = last_hidden_states.masked_fill(~attention_mask[..., None].bool(), 0.0)
    return last_hidden.sum(dim=1) / attention_mask.sum(dim=1)[..., None]


# Each input text should start with "query: " or "passage: ".
# For tasks other than retrieval, you can simply use the "query: " prefix.
input_texts = ['query: how much protein should a female eat',
               'query: summit define',
               "passage: As a general guideline, the CDC's average requirement of protein for women ages 19 to 70 is 46 grams per day. But, as you can see from this chart, you'll need to increase that if you're expecting or training for a marathon. Check out the chart below to see how much protein you should be eating each day.",
               "passage: Definition of summit for English Language Learners. : 1 the highest point of a mountain : the top of a mountain. : 2 the highest level. : 3 a meeting or series of meetings between the leaders of two or more governments."]

tokenizer = AutoTokenizer.from_pretrained('intfloat/e5-small')
model = AutoModel.from_pretrained('intfloat/e5-small')

# Tokenize the input texts
batch_dict = tokenizer(input_texts, max_length=512, padding=True, truncation=True, return_tensors='pt')

outputs = model(**batch_dict)
embeddings = average_pool(outputs.last_hidden_state, batch_dict['attention_mask'])

# (Optionally) normalize embeddings
embeddings = F.normalize(embeddings, p=2, dim=1)
scores = (embeddings[:2] @ embeddings[2:].T) * 100
print(scores.tolist())
```
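The masked mean computed by `average_pool` can be checked on toy values (a NumPy sketch with made-up numbers, not actual hidden states): padded positions are zeroed out and excluded from the denominator.

```python
import numpy as np

# Batch of 1, seq_len 3, hidden size 2; the last position is padding
hidden = np.array([[[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]])
mask = np.array([[1, 1, 0]])  # attention mask: 0 marks padding

masked = hidden * mask[..., None]                           # zero out padding
pooled = masked.sum(axis=1) / mask.sum(axis=1)[..., None]   # divide by real-token count
print(pooled)  # mean over the two real tokens only
```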

## Training Details

Please refer to our paper at [https://arxiv.org/pdf/2212.03533.pdf](https://arxiv.org/pdf/2212.03533.pdf).

## Benchmark Evaluation

Check out [unilm/e5](https://github.com/microsoft/unilm/tree/master/e5) to reproduce evaluation results
on the [BEIR](https://arxiv.org/abs/2104.08663) and [MTEB](https://arxiv.org/abs/2210.07316) benchmarks.

## Citation

If you find our paper or models helpful, please consider citing them as follows:

```
@article{wang2022text,
  title={Text Embeddings by Weakly-Supervised Contrastive Pre-training},
  author={Wang, Liang and Yang, Nan and Huang, Xiaolong and Jiao, Binxing and Yang, Linjun and Jiang, Daxin and Majumder, Rangan and Wei, Furu},
  journal={arXiv preprint arXiv:2212.03533},
  year={2022}
}
```

## Limitations

This model only works for English texts. Long texts will be truncated to at most 512 tokens.
config.json ADDED
{
  "bos_token": "<s>",
  "eos_token": "</s>",
  "layer_norm_epsilon": 1e-12,
  "unk_token": "[UNK]"
}
model.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:738ee67946d5969622bf1777a455b1b6eef99d62ce859cd781e1633bcc0b4260
size 133448364
modules.json ADDED
[
  {
    "idx": 0,
    "name": "0",
    "path": "",
    "type": "sentence_transformers.models.Transformer"
  },
  {
    "idx": 1,
    "name": "1",
    "path": "1_Pooling",
    "type": "sentence_transformers.models.Pooling"
  },
  {
    "idx": 2,
    "name": "2",
    "path": "2_Normalize",
    "type": "sentence_transformers.models.Normalize"
  }
]
sentence_bert_config.json ADDED
{
  "max_seq_length": 512,
  "do_lower_case": false
}
special_tokens_map.json ADDED
{"unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]"}
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
{"do_lower_case": true, "unk_token": "[UNK]", "sep_token": "[SEP]", "pad_token": "[PAD]", "cls_token": "[CLS]", "mask_token": "[MASK]", "tokenize_chinese_chars": true, "strip_accents": null, "special_tokens_map_file": null, "name_or_path": "amlt/1109_tnlrv3_bs32k_ft/all_kd_ft", "do_basic_tokenize": true, "never_split": null, "tokenizer_class": "BertTokenizer"}
vocab.txt ADDED
The diff for this file is too large to render. See raw diff
 
vocabulary.json ADDED
The diff for this file is too large to render. See raw diff
 
vocabulary.txt ADDED
The diff for this file is too large to render. See raw diff