Muennighoff committed on
Commit
90780e8
1 Parent(s): f0dd127

Add MTEB eval results

Browse files
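
The scores added in this commit are MTEB evaluation results in the model-card's `model-index` metadata. As a point of reference, results like these are typically produced with the MTEB benchmark harness against a sentence-transformers checkpoint; the sketch below is a minimal, assumed example (the Hub repo id and the chosen task subset are illustrative assumptions, not taken from this commit):

```python
# Minimal sketch: run a couple of MTEB tasks against the model and write one
# result JSON per task, which can then be turned into model-index metadata.
# Assumes `pip install mteb sentence-transformers`; the repo id below is an
# assumption based on the commit author and model name.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Muennighoff/SGPT-125M-weightedmean-nli-bitfit")  # assumed repo id

# Two of the tasks reported in the metadata below; MTEB accepts any subset of task names.
evaluation = MTEB(tasks=["MedrxivClusteringP2P", "STS13"])
evaluation.run(model, output_folder="results/sgpt-125m")  # one JSON file per task
```
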
Files changed (1)
  1. README.md +3843 -0
README.md CHANGED
@@ -4,6 +4,3849 @@ tags:
4
  - sentence-transformers
5
  - feature-extraction
6
  - sentence-similarity
7
+ model-index:
8
+ - name: SGPT-125M-weightedmean-nli-bitfit
9
+ results:
10
+ - task:
11
+ type: Clustering
12
+ dataset:
13
+ type: mteb/medrxiv-clustering-p2p
14
+ name: MTEB MedrxivClusteringP2P
15
+ metrics:
16
+ - type: v_measure
17
+ value: 0.28301902023313874
18
+ - task:
19
+ type: STS
20
+ dataset:
21
+ type: mteb/sts13-sts
22
+ name: MTEB STS13
23
+ metrics:
24
+ - type: cos_sim_pearson
25
+ value: 0.76401935081936
26
+ - type: cos_sim_spearman
27
+ value: 0.7723446219694267
28
+ - type: euclidean_pearson
29
+ value: 0.7461017160439877
30
+ - type: euclidean_spearman
31
+ value: 0.7585871531365609
32
+ - type: manhattan_pearson
33
+ value: 0.7483034779539725
34
+ - type: manhattan_spearman
35
+ value: 0.759594899358843
36
+ - task:
37
+ type: Clustering
38
+ dataset:
39
+ type: mteb/arxiv-clustering-p2p
40
+ name: MTEB ArxivClusteringP2P
41
+ metrics:
42
+ - type: v_measure
43
+ value: 0.3474248247787077
44
+ - task:
45
+ type: Classification
46
+ dataset:
47
+ type: mteb/amazon_reviews_multi
48
+ name: MTEB AmazonReviewsClassification (en)
49
+ metrics:
50
+ - type: accuracy
51
+ value: 0.35098
52
+ - type: f1
53
+ value: 0.34732656514357263
54
+ - task:
55
+ type: Classification
56
+ dataset:
57
+ type: mteb/amazon_reviews_multi
58
+ name: MTEB AmazonReviewsClassification (de)
59
+ metrics:
60
+ - type: accuracy
61
+ value: 0.24516
62
+ - type: f1
63
+ value: 0.2421748200448397
64
+ - task:
65
+ type: Classification
66
+ dataset:
67
+ type: mteb/amazon_reviews_multi
68
+ name: MTEB AmazonReviewsClassification (es)
69
+ metrics:
70
+ - type: accuracy
71
+ value: 0.29097999999999996
72
+ - type: f1
73
+ value: 0.28620040162757093
74
+ - task:
75
+ type: Classification
76
+ dataset:
77
+ type: mteb/amazon_reviews_multi
78
+ name: MTEB AmazonReviewsClassification (fr)
79
+ metrics:
80
+ - type: accuracy
81
+ value: 0.27396
82
+ - type: f1
83
+ value: 0.27146888644986283
84
+ - task:
85
+ type: Classification
86
+ dataset:
87
+ type: mteb/amazon_reviews_multi
88
+ name: MTEB AmazonReviewsClassification (ja)
89
+ metrics:
90
+ - type: accuracy
91
+ value: 0.21724000000000002
92
+ - type: f1
93
+ value: 0.2137230564276654
94
+ - task:
95
+ type: Classification
96
+ dataset:
97
+ type: mteb/amazon_reviews_multi
98
+ name: MTEB AmazonReviewsClassification (zh)
99
+ metrics:
100
+ - type: accuracy
101
+ value: 0.23975999999999997
102
+ - type: f1
103
+ value: 0.23741137981755484
104
+ - task:
105
+ type: BitextMining
106
+ dataset:
107
+ type: mteb/bucc-bitext-mining
108
+ name: MTEB BUCC (de-en)
109
+ metrics:
110
+ - type: accuracy
111
+ value: 0.010960334029227558
112
+ - type: f1
113
+ value: 0.01092553931802366
114
+ - type: precision
115
+ value: 0.010908141962421711
116
+ - type: recall
117
+ value: 0.010960334029227558
118
+ - task:
119
+ type: BitextMining
120
+ dataset:
121
+ type: mteb/bucc-bitext-mining
122
+ name: MTEB BUCC (fr-en)
123
+ metrics:
124
+ - type: accuracy
125
+ value: 0.00022011886418666079
126
+ - type: f1
127
+ value: 0.00022011886418666079
128
+ - type: precision
129
+ value: 0.00022011886418666079
130
+ - type: recall
131
+ value: 0.00022011886418666079
132
+ - task:
133
+ type: BitextMining
134
+ dataset:
135
+ type: mteb/bucc-bitext-mining
136
+ name: MTEB BUCC (ru-en)
137
+ metrics:
138
+ - type: accuracy
139
+ value: 0.0
140
+ - type: f1
141
+ value: 0.0
142
+ - type: precision
143
+ value: 0.0
144
+ - type: recall
145
+ value: 0.0
146
+ - task:
147
+ type: BitextMining
148
+ dataset:
149
+ type: mteb/bucc-bitext-mining
150
+ name: MTEB BUCC (zh-en)
151
+ metrics:
152
+ - type: accuracy
153
+ value: 0.0
154
+ - type: f1
155
+ value: 0.0
156
+ - type: precision
157
+ value: 0.0
158
+ - type: recall
159
+ value: 0.0
160
+ - task:
161
+ type: Classification
162
+ dataset:
163
+ type: mteb/mtop_domain
164
+ name: MTEB MTOPDomainClassification (en)
165
+ metrics:
166
+ - type: accuracy
167
+ value: 0.8151846785225718
168
+ - type: f1
169
+ value: 0.81648869152345
170
+ - task:
171
+ type: Classification
172
+ dataset:
173
+ type: mteb/mtop_domain
174
+ name: MTEB MTOPDomainClassification (de)
175
+ metrics:
176
+ - type: accuracy
177
+ value: 0.6037475345167653
178
+ - type: f1
179
+ value: 0.5845264937551703
180
+ - task:
181
+ type: Classification
182
+ dataset:
183
+ type: mteb/mtop_domain
184
+ name: MTEB MTOPDomainClassification (es)
185
+ metrics:
186
+ - type: accuracy
187
+ value: 0.6736824549699799
188
+ - type: f1
189
+ value: 0.6535927434998515
190
+ - task:
191
+ type: Classification
192
+ dataset:
193
+ type: mteb/mtop_domain
194
+ name: MTEB MTOPDomainClassification (fr)
195
+ metrics:
196
+ - type: accuracy
197
+ value: 0.6312871907297212
198
+ - type: f1
199
+ value: 0.6137620329272278
200
+ - task:
201
+ type: Classification
202
+ dataset:
203
+ type: mteb/mtop_domain
204
+ name: MTEB MTOPDomainClassification (hi)
205
+ metrics:
206
+ - type: accuracy
207
+ value: 0.47045536034420943
208
+ - type: f1
209
+ value: 0.46203899126445613
210
+ - task:
211
+ type: Classification
212
+ dataset:
213
+ type: mteb/mtop_domain
214
+ name: MTEB MTOPDomainClassification (th)
215
+ metrics:
216
+ - type: accuracy
217
+ value: 0.5228209764918625
218
+ - type: f1
219
+ value: 0.5075489206473579
220
+ - task:
221
+ type: Retrieval
222
+ dataset:
223
+ type: BeIR/cqadupstack
224
+ name: MTEB CQADupstackMathematicaRetrieval
225
+ metrics:
226
+ - type: map_at_1
227
+ value: 0.0808
228
+ - type: map_at_10
229
+ value: 0.11691
230
+ - type: map_at_100
231
+ value: 0.12312
232
+ - type: map_at_1000
233
+ value: 0.12439
234
+ - type: map_at_3
235
+ value: 0.10344
236
+ - type: map_at_5
237
+ value: 0.10996
238
+ - type: ndcg_at_1
239
+ value: 0.10697
240
+ - type: ndcg_at_10
241
+ value: 0.1448
242
+ - type: ndcg_at_100
243
+ value: 0.18161
244
+ - type: ndcg_at_1000
245
+ value: 0.21886
246
+ - type: ndcg_at_3
247
+ value: 0.11872
248
+ - type: ndcg_at_5
249
+ value: 0.12834
250
+ - type: precision_at_1
251
+ value: 0.10697
252
+ - type: precision_at_10
253
+ value: 0.02811
254
+ - type: precision_at_100
255
+ value: 0.00551
256
+ - type: precision_at_1000
257
+ value: 0.00102
258
+ - type: precision_at_3
259
+ value: 0.05804
260
+ - type: precision_at_5
261
+ value: 0.04154
262
+ - type: recall_at_1
263
+ value: 0.0808
264
+ - type: recall_at_10
265
+ value: 0.20235
266
+ - type: recall_at_100
267
+ value: 0.37526
268
+ - type: recall_at_1000
269
+ value: 0.65106
270
+ - type: recall_at_3
271
+ value: 0.12804
272
+ - type: recall_at_5
273
+ value: 0.15499
274
+ - task:
275
+ type: Classification
276
+ dataset:
277
+ type: mteb/amazon_counterfactual
278
+ name: MTEB AmazonCounterfactualClassification (en)
279
+ metrics:
280
+ - type: accuracy
281
+ value: 0.6588059701492537
282
+ - type: ap
283
+ value: 0.28685493163579784
284
+ - type: f1
285
+ value: 0.5979951005816335
286
+ - task:
287
+ type: Classification
288
+ dataset:
289
+ type: mteb/amazon_counterfactual
290
+ name: MTEB AmazonCounterfactualClassification (de)
291
+ metrics:
292
+ - type: accuracy
293
+ value: 0.5907922912205568
294
+ - type: ap
295
+ value: 0.7391887421019034
296
+ - type: f1
297
+ value: 0.566316368658711
298
+ - task:
299
+ type: Classification
300
+ dataset:
301
+ type: mteb/amazon_counterfactual
302
+ name: MTEB AmazonCounterfactualClassification (en-ext)
303
+ metrics:
304
+ - type: accuracy
305
+ value: 0.6491754122938531
306
+ - type: ap
307
+ value: 0.16360681214864226
308
+ - type: f1
309
+ value: 0.5312659206152377
310
+ - task:
311
+ type: Classification
312
+ dataset:
313
+ type: mteb/amazon_counterfactual
314
+ name: MTEB AmazonCounterfactualClassification (ja)
315
+ metrics:
316
+ - type: accuracy
317
+ value: 0.56423982869379
318
+ - type: ap
319
+ value: 0.12143003571907898
320
+ - type: f1
321
+ value: 0.45763637779874716
322
+ - task:
323
+ type: Retrieval
324
+ dataset:
325
+ type: BeIR/cqadupstack
326
+ name: MTEB CQADupstackTexRetrieval
327
+ metrics:
328
+ - type: map_at_1
329
+ value: 0.06496
330
+ - type: map_at_10
331
+ value: 0.09243
332
+ - type: map_at_100
333
+ value: 0.09841
334
+ - type: map_at_1000
335
+ value: 0.09946
336
+ - type: map_at_3
337
+ value: 0.08395
338
+ - type: map_at_5
339
+ value: 0.08872
340
+ - type: ndcg_at_1
341
+ value: 0.08224
342
+ - type: ndcg_at_10
343
+ value: 0.1124
344
+ - type: ndcg_at_100
345
+ value: 0.14525
346
+ - type: ndcg_at_1000
347
+ value: 0.17686
348
+ - type: ndcg_at_3
349
+ value: 0.09617
350
+ - type: ndcg_at_5
351
+ value: 0.1037
352
+ - type: precision_at_1
353
+ value: 0.08224
354
+ - type: precision_at_10
355
+ value: 0.02082
356
+ - type: precision_at_100
357
+ value: 0.00443
358
+ - type: precision_at_1000
359
+ value: 0.00085
360
+ - type: precision_at_3
361
+ value: 0.04623
362
+ - type: precision_at_5
363
+ value: 0.03331
364
+ - type: recall_at_1
365
+ value: 0.06496
366
+ - type: recall_at_10
367
+ value: 0.1531
368
+ - type: recall_at_100
369
+ value: 0.3068
370
+ - type: recall_at_1000
371
+ value: 0.54335
372
+ - type: recall_at_3
373
+ value: 0.10691
374
+ - type: recall_at_5
375
+ value: 0.12688
376
+ - task:
377
+ type: Reranking
378
+ dataset:
379
+ type: mteb/mind_small
380
+ name: MTEB MindSmallReranking
381
+ metrics:
382
+ - type: map
383
+ value: 0.2926934104146833
384
+ - type: mrr
385
+ value: 0.3013214087687572
386
+ - task:
387
+ type: Retrieval
388
+ dataset:
389
+ type: nfcorpus
390
+ name: MTEB NFCorpus
391
+ metrics:
392
+ - type: map_at_1
393
+ value: 0.01227
394
+ - type: map_at_10
395
+ value: 0.03081
396
+ - type: map_at_100
397
+ value: 0.04104
398
+ - type: map_at_1000
399
+ value: 0.04989
400
+ - type: map_at_3
401
+ value: 0.02221
402
+ - type: map_at_5
403
+ value: 0.02535
404
+ - type: ndcg_at_1
405
+ value: 0.15015
406
+ - type: ndcg_at_10
407
+ value: 0.11805
408
+ - type: ndcg_at_100
409
+ value: 0.12452
410
+ - type: ndcg_at_1000
411
+ value: 0.22284
412
+ - type: ndcg_at_3
413
+ value: 0.13257
414
+ - type: ndcg_at_5
415
+ value: 0.12199
416
+ - type: precision_at_1
417
+ value: 0.16409
418
+ - type: precision_at_10
419
+ value: 0.09102
420
+ - type: precision_at_100
421
+ value: 0.03678
422
+ - type: precision_at_1000
423
+ value: 0.01609
424
+ - type: precision_at_3
425
+ value: 0.12797
426
+ - type: precision_at_5
427
+ value: 0.10464
428
+ - type: recall_at_1
429
+ value: 0.01227
430
+ - type: recall_at_10
431
+ value: 0.05838
432
+ - type: recall_at_100
433
+ value: 0.15716
434
+ - type: recall_at_1000
435
+ value: 0.48837
436
+ - type: recall_at_3
437
+ value: 0.02828
438
+ - type: recall_at_5
439
+ value: 0.03697
440
+ - task:
441
+ type: Retrieval
442
+ dataset:
443
+ type: msmarco
444
+ name: MTEB MSMARCO
445
+ metrics:
446
+ - type: map_at_1
447
+ value: 0.0288
448
+ - type: map_at_10
449
+ value: 0.04914
450
+ - type: map_at_100
451
+ value: 0.05459
452
+ - type: map_at_1000
453
+ value: 0.05538
454
+ - type: map_at_3
455
+ value: 0.04087
456
+ - type: map_at_5
457
+ value: 0.04518
458
+ - type: ndcg_at_1
459
+ value: 0.02937
460
+ - type: ndcg_at_10
461
+ value: 0.06273
462
+ - type: ndcg_at_100
463
+ value: 0.09426
464
+ - type: ndcg_at_1000
465
+ value: 0.12033
466
+ - type: ndcg_at_3
467
+ value: 0.04513
468
+ - type: ndcg_at_5
469
+ value: 0.05292
470
+ - type: precision_at_1
471
+ value: 0.02937
472
+ - type: precision_at_10
473
+ value: 0.01089
474
+ - type: precision_at_100
475
+ value: 0.00277
476
+ - type: precision_at_1000
477
+ value: 0.00051
478
+ - type: precision_at_3
479
+ value: 0.01929
480
+ - type: precision_at_5
481
+ value: 0.01547
482
+ - type: recall_at_1
483
+ value: 0.0288
484
+ - type: recall_at_10
485
+ value: 0.10578
486
+ - type: recall_at_100
487
+ value: 0.26267
488
+ - type: recall_at_1000
489
+ value: 0.4759
490
+ - type: recall_at_3
491
+ value: 0.05673
492
+ - type: recall_at_5
493
+ value: 0.07545
494
+ - task:
495
+ type: Retrieval
496
+ dataset:
497
+ type: BeIR/cqadupstack
498
+ name: MTEB CQADupstackUnixRetrieval
499
+ metrics:
500
+ - type: map_at_1
501
+ value: 0.13843
502
+ - type: map_at_10
503
+ value: 0.17496
504
+ - type: map_at_100
505
+ value: 0.18304
506
+ - type: map_at_1000
507
+ value: 0.18426
508
+ - type: map_at_3
509
+ value: 0.16225
510
+ - type: map_at_5
511
+ value: 0.1683
512
+ - type: ndcg_at_1
513
+ value: 0.16698
514
+ - type: ndcg_at_10
515
+ value: 0.20301
516
+ - type: ndcg_at_100
517
+ value: 0.24523
518
+ - type: ndcg_at_1000
519
+ value: 0.27784
520
+ - type: ndcg_at_3
521
+ value: 0.17822
522
+ - type: ndcg_at_5
523
+ value: 0.18794
524
+ - type: precision_at_1
525
+ value: 0.16698
526
+ - type: precision_at_10
527
+ value: 0.03358
528
+ - type: precision_at_100
529
+ value: 0.00618
530
+ - type: precision_at_1000
531
+ value: 0.00101
532
+ - type: precision_at_3
533
+ value: 0.07898
534
+ - type: precision_at_5
535
+ value: 0.05429
536
+ - type: recall_at_1
537
+ value: 0.13843
538
+ - type: recall_at_10
539
+ value: 0.25888
540
+ - type: recall_at_100
541
+ value: 0.45028
542
+ - type: recall_at_1000
543
+ value: 0.68991
544
+ - type: recall_at_3
545
+ value: 0.18851
546
+ - type: recall_at_5
547
+ value: 0.21462
548
+ - task:
549
+ type: STS
550
+ dataset:
551
+ type: mteb/sts12-sts
552
+ name: MTEB STS12
553
+ metrics:
554
+ - type: cos_sim_pearson
555
+ value: 0.8020938796088339
556
+ - type: cos_sim_spearman
557
+ value: 0.6916914010333395
558
+ - type: euclidean_pearson
559
+ value: 0.7933415250097545
560
+ - type: euclidean_spearman
561
+ value: 0.7146707320292746
562
+ - type: manhattan_pearson
563
+ value: 0.7973669837981976
564
+ - type: manhattan_spearman
565
+ value: 0.7187919511134903
566
+ - task:
567
+ type: Clustering
568
+ dataset:
569
+ type: mteb/stackexchange-clustering
570
+ name: MTEB StackExchangeClustering
571
+ metrics:
572
+ - type: v_measure
573
+ value: 0.4459127540530939
574
+ - task:
575
+ type: Reranking
576
+ dataset:
577
+ type: mteb/scidocs-reranking
578
+ name: MTEB SciDocsRR
579
+ metrics:
580
+ - type: map
581
+ value: 0.6835710819755543
582
+ - type: mrr
583
+ value: 0.8805442832403617
584
+ - task:
585
+ type: Retrieval
586
+ dataset:
587
+ type: arguana
588
+ name: MTEB ArguAna
589
+ metrics:
590
+ - type: map_at_1
591
+ value: 0.13442
592
+ - type: map_at_10
593
+ value: 0.24275
594
+ - type: map_at_100
595
+ value: 0.25588
596
+ - type: map_at_1000
597
+ value: 0.25659
598
+ - type: map_at_3
599
+ value: 0.20092
600
+ - type: map_at_5
601
+ value: 0.2244
602
+ - type: ndcg_at_1
603
+ value: 0.13442
604
+ - type: ndcg_at_10
605
+ value: 0.3104
606
+ - type: ndcg_at_100
607
+ value: 0.37529
608
+ - type: ndcg_at_1000
609
+ value: 0.39348
610
+ - type: ndcg_at_3
611
+ value: 0.22342
612
+ - type: ndcg_at_5
613
+ value: 0.26596
614
+ - type: precision_at_1
615
+ value: 0.13442
616
+ - type: precision_at_10
617
+ value: 0.05299
618
+ - type: precision_at_100
619
+ value: 0.00836
620
+ - type: precision_at_1000
621
+ value: 0.00098
622
+ - type: precision_at_3
623
+ value: 0.09625
624
+ - type: precision_at_5
625
+ value: 0.07852
626
+ - type: recall_at_1
627
+ value: 0.13442
628
+ - type: recall_at_10
629
+ value: 0.52987
630
+ - type: recall_at_100
631
+ value: 0.83642
632
+ - type: recall_at_1000
633
+ value: 0.97795
634
+ - type: recall_at_3
635
+ value: 0.28876
636
+ - type: recall_at_5
637
+ value: 0.3926
638
+ - task:
639
+ type: Reranking
640
+ dataset:
641
+ type: mteb/askubuntudupquestions-reranking
642
+ name: MTEB AskUbuntuDupQuestions
643
+ metrics:
644
+ - type: map
645
+ value: 0.5263439984994702
646
+ - type: mrr
647
+ value: 0.6575704612408213
648
+ - task:
649
+ type: Classification
650
+ dataset:
651
+ type: mteb/tweet_sentiment_extraction
652
+ name: MTEB TweetSentimentExtractionClassification
653
+ metrics:
654
+ - type: accuracy
655
+ value: 0.5482173174872665
656
+ - type: f1
657
+ value: 0.5514729314789282
658
+ - task:
659
+ type: Clustering
660
+ dataset:
661
+ type: mteb/arxiv-clustering-s2s
662
+ name: MTEB ArxivClusteringS2S
663
+ metrics:
664
+ - type: v_measure
665
+ value: 0.2467870651472156
666
+ - task:
667
+ type: Retrieval
668
+ dataset:
669
+ type: hotpotqa
670
+ name: MTEB HotpotQA
671
+ metrics:
672
+ - type: map_at_1
673
+ value: 0.09676
674
+ - type: map_at_10
675
+ value: 0.13351
676
+ - type: map_at_100
677
+ value: 0.13919
678
+ - type: map_at_1000
679
+ value: 0.1401
680
+ - type: map_at_3
681
+ value: 0.12223
682
+ - type: map_at_5
683
+ value: 0.12812
684
+ - type: ndcg_at_1
685
+ value: 0.19352
686
+ - type: ndcg_at_10
687
+ value: 0.17727
688
+ - type: ndcg_at_100
689
+ value: 0.20837
690
+ - type: ndcg_at_1000
691
+ value: 0.23412
692
+ - type: ndcg_at_3
693
+ value: 0.15317
694
+ - type: ndcg_at_5
695
+ value: 0.16436
696
+ - type: precision_at_1
697
+ value: 0.19352
698
+ - type: precision_at_10
699
+ value: 0.03993
700
+ - type: precision_at_100
701
+ value: 0.00651
702
+ - type: precision_at_1000
703
+ value: 0.001
704
+ - type: precision_at_3
705
+ value: 0.09669
706
+ - type: precision_at_5
707
+ value: 0.0669
708
+ - type: recall_at_1
709
+ value: 0.09676
710
+ - type: recall_at_10
711
+ value: 0.19966
712
+ - type: recall_at_100
713
+ value: 0.32573
714
+ - type: recall_at_1000
715
+ value: 0.49905
716
+ - type: recall_at_3
717
+ value: 0.14504
718
+ - type: recall_at_5
719
+ value: 0.16725
720
+ - task:
721
+ type: Retrieval
722
+ dataset:
723
+ type: webis-touche2020
724
+ name: MTEB Touche2020
725
+ metrics:
726
+ - type: map_at_1
727
+ value: 0.00645
728
+ - type: map_at_10
729
+ value: 0.04116
730
+ - type: map_at_100
731
+ value: 0.07527
732
+ - type: map_at_1000
733
+ value: 0.08678
734
+ - type: map_at_3
735
+ value: 0.01602
736
+ - type: map_at_5
737
+ value: 0.026
738
+ - type: ndcg_at_1
739
+ value: 0.10204
740
+ - type: ndcg_at_10
741
+ value: 0.1227
742
+ - type: ndcg_at_100
743
+ value: 0.22461
744
+ - type: ndcg_at_1000
745
+ value: 0.33543
746
+ - type: ndcg_at_3
747
+ value: 0.09982
748
+ - type: ndcg_at_5
749
+ value: 0.11498
750
+ - type: precision_at_1
751
+ value: 0.10204
752
+ - type: precision_at_10
753
+ value: 0.12245
754
+ - type: precision_at_100
755
+ value: 0.05286
756
+ - type: precision_at_1000
757
+ value: 0.01263
758
+ - type: precision_at_3
759
+ value: 0.10884
760
+ - type: precision_at_5
761
+ value: 0.13061
762
+ - type: recall_at_1
763
+ value: 0.00645
764
+ - type: recall_at_10
765
+ value: 0.08996
766
+ - type: recall_at_100
767
+ value: 0.33666
768
+ - type: recall_at_1000
769
+ value: 0.67704
770
+ - type: recall_at_3
771
+ value: 0.02504
772
+ - type: recall_at_5
773
+ value: 0.0495
774
+ - task:
775
+ type: Retrieval
776
+ dataset:
777
+ type: BeIR/cqadupstack
778
+ name: MTEB CQADupstackAndroidRetrieval
779
+ metrics:
780
+ - type: map_at_1
781
+ value: 0.18222
782
+ - type: map_at_10
783
+ value: 0.24506
784
+ - type: map_at_100
785
+ value: 0.25611
786
+ - type: map_at_1000
787
+ value: 0.25758
788
+ - type: map_at_3
789
+ value: 0.22265
790
+ - type: map_at_5
791
+ value: 0.23698
792
+ - type: ndcg_at_1
793
+ value: 0.23033
794
+ - type: ndcg_at_10
795
+ value: 0.28719
796
+ - type: ndcg_at_100
797
+ value: 0.33748
798
+ - type: ndcg_at_1000
799
+ value: 0.37056
800
+ - type: ndcg_at_3
801
+ value: 0.2524
802
+ - type: ndcg_at_5
803
+ value: 0.2712
804
+ - type: precision_at_1
805
+ value: 0.23033
806
+ - type: precision_at_10
807
+ value: 0.05408
808
+ - type: precision_at_100
809
+ value: 0.01004
810
+ - type: precision_at_1000
811
+ value: 0.00158
812
+ - type: precision_at_3
813
+ value: 0.11874
814
+ - type: precision_at_5
815
+ value: 0.08927
816
+ - type: recall_at_1
817
+ value: 0.18222
818
+ - type: recall_at_10
819
+ value: 0.36355
820
+ - type: recall_at_100
821
+ value: 0.58724
822
+ - type: recall_at_1000
823
+ value: 0.81335
824
+ - type: recall_at_3
825
+ value: 0.26334
826
+ - type: recall_at_5
827
+ value: 0.314
828
+ - task:
829
+ type: Summarization
830
+ dataset:
831
+ type: mteb/summeval
832
+ name: MTEB SummEval
833
+ metrics:
834
+ - type: cos_sim_pearson
835
+ value: 0.3056303767714449
836
+ - type: cos_sim_spearman
837
+ value: 0.30256847004390486
838
+ - type: dot_pearson
839
+ value: 0.29453520030995006
840
+ - type: dot_spearman
841
+ value: 0.2956173255092678
842
+ - task:
843
+ type: Classification
844
+ dataset:
845
+ type: mteb/imdb
846
+ name: MTEB ImdbClassification
847
+ metrics:
848
+ - type: accuracy
849
+ value: 0.62896
850
+ - type: ap
851
+ value: 0.5847769349850157
852
+ - type: f1
853
+ value: 0.6267885149592086
854
+ - task:
855
+ type: STS
856
+ dataset:
857
+ type: mteb/sts15-sts
858
+ name: MTEB STS15
859
+ metrics:
860
+ - type: cos_sim_pearson
861
+ value: 0.7905293131911804
862
+ - type: cos_sim_spearman
863
+ value: 0.7973794782598049
864
+ - type: euclidean_pearson
865
+ value: 0.7817016171851057
866
+ - type: euclidean_spearman
867
+ value: 0.7876038607583106
868
+ - type: manhattan_pearson
869
+ value: 0.784994607532332
870
+ - type: manhattan_spearman
871
+ value: 0.7913026720132872
872
+ - task:
873
+ type: Clustering
874
+ dataset:
875
+ type: mteb/medrxiv-clustering-s2s
876
+ name: MTEB MedrxivClusteringS2S
877
+ metrics:
878
+ - type: v_measure
879
+ value: 0.24932123582259286
880
+ - task:
881
+ type: Retrieval
882
+ dataset:
883
+ type: climate-fever
884
+ name: MTEB ClimateFEVER
885
+ metrics:
886
+ - type: map_at_1
887
+ value: 0.03714
888
+ - type: map_at_10
889
+ value: 0.06926
890
+ - type: map_at_100
891
+ value: 0.07879
892
+ - type: map_at_1000
893
+ value: 0.08032
894
+ - type: map_at_3
895
+ value: 0.05504
896
+ - type: map_at_5
897
+ value: 0.06357
898
+ - type: ndcg_at_1
899
+ value: 0.0886
900
+ - type: ndcg_at_10
901
+ value: 0.11007
902
+ - type: ndcg_at_100
903
+ value: 0.16154
904
+ - type: ndcg_at_1000
905
+ value: 0.19668
906
+ - type: ndcg_at_3
907
+ value: 0.08103
908
+ - type: ndcg_at_5
909
+ value: 0.09456
910
+ - type: precision_at_1
911
+ value: 0.0886
912
+ - type: precision_at_10
913
+ value: 0.0372
914
+ - type: precision_at_100
915
+ value: 0.00917
916
+ - type: precision_at_1000
917
+ value: 0.00156
918
+ - type: precision_at_3
919
+ value: 0.06254
920
+ - type: precision_at_5
921
+ value: 0.05381
922
+ - type: recall_at_1
923
+ value: 0.03714
924
+ - type: recall_at_10
925
+ value: 0.14382
926
+ - type: recall_at_100
927
+ value: 0.33166
928
+ - type: recall_at_1000
929
+ value: 0.53444
930
+ - type: recall_at_3
931
+ value: 0.07523
932
+ - type: recall_at_5
933
+ value: 0.1091
934
+ - task:
935
+ type: STS
936
+ dataset:
937
+ type: mteb/sts14-sts
938
+ name: MTEB STS14
939
+ metrics:
940
+ - type: cos_sim_pearson
941
+ value: 0.7535551963935667
942
+ - type: cos_sim_spearman
943
+ value: 0.7098892671568665
944
+ - type: euclidean_pearson
945
+ value: 0.7324467338564629
946
+ - type: euclidean_spearman
947
+ value: 0.7197533151639425
948
+ - type: manhattan_pearson
949
+ value: 0.7327765593599381
950
+ - type: manhattan_spearman
951
+ value: 0.722221421456084
952
+ - task:
953
+ type: Retrieval
954
+ dataset:
955
+ type: BeIR/cqadupstack
956
+ name: MTEB CQADupstackEnglishRetrieval
957
+ metrics:
958
+ - type: map_at_1
959
+ value: 0.12058
960
+ - type: map_at_10
961
+ value: 0.16051
962
+ - type: map_at_100
963
+ value: 0.16772
964
+ - type: map_at_1000
965
+ value: 0.16871
966
+ - type: map_at_3
967
+ value: 0.1478
968
+ - type: map_at_5
969
+ value: 0.155
970
+ - type: ndcg_at_1
971
+ value: 0.1535
972
+ - type: ndcg_at_10
973
+ value: 0.18804
974
+ - type: ndcg_at_100
975
+ value: 0.22346
976
+ - type: ndcg_at_1000
977
+ value: 0.25007
978
+ - type: ndcg_at_3
979
+ value: 0.16768
980
+ - type: ndcg_at_5
981
+ value: 0.17692
982
+ - type: precision_at_1
983
+ value: 0.1535
984
+ - type: precision_at_10
985
+ value: 0.0351
986
+ - type: precision_at_100
987
+ value: 0.00664
988
+ - type: precision_at_1000
989
+ value: 0.00111
990
+ - type: precision_at_3
991
+ value: 0.07983
992
+ - type: precision_at_5
993
+ value: 0.05656
994
+ - type: recall_at_1
995
+ value: 0.12058
996
+ - type: recall_at_10
997
+ value: 0.23644
998
+ - type: recall_at_100
999
+ value: 0.3976
1000
+ - type: recall_at_1000
1001
+ value: 0.5856
1002
+ - type: recall_at_3
1003
+ value: 0.17542
1004
+ - type: recall_at_5
1005
+ value: 0.20232
1006
+ - task:
1007
+ type: Retrieval
1008
+ dataset:
1009
+ type: BeIR/cqadupstack
1010
+ name: MTEB CQADupstackGamingRetrieval
1011
+ metrics:
1012
+ - type: map_at_1
1013
+ value: 0.21183
1014
+ - type: map_at_10
1015
+ value: 0.289
1016
+ - type: map_at_100
1017
+ value: 0.29858
1018
+ - type: map_at_1000
1019
+ value: 0.29954
1020
+ - type: map_at_3
1021
+ value: 0.2658
1022
+ - type: map_at_5
1023
+ value: 0.27912
1024
+ - type: ndcg_at_1
1025
+ value: 0.24765
1026
+ - type: ndcg_at_10
1027
+ value: 0.3334
1028
+ - type: ndcg_at_100
1029
+ value: 0.37997
1030
+ - type: ndcg_at_1000
1031
+ value: 0.40416
1032
+ - type: ndcg_at_3
1033
+ value: 0.29045
1034
+ - type: ndcg_at_5
1035
+ value: 0.31121
1036
+ - type: precision_at_1
1037
+ value: 0.24765
1038
+ - type: precision_at_10
1039
+ value: 0.05599
1040
+ - type: precision_at_100
1041
+ value: 0.0087
1042
+ - type: precision_at_1000
1043
+ value: 0.00115
1044
+ - type: precision_at_3
1045
+ value: 0.13271
1046
+ - type: precision_at_5
1047
+ value: 0.09367
1048
+ - type: recall_at_1
1049
+ value: 0.21183
1050
+ - type: recall_at_10
1051
+ value: 0.43875
1052
+ - type: recall_at_100
1053
+ value: 0.65005
1054
+ - type: recall_at_1000
1055
+ value: 0.83017
1056
+ - type: recall_at_3
1057
+ value: 0.32232
1058
+ - type: recall_at_5
1059
+ value: 0.37308
1060
+ - task:
1061
+ type: Retrieval
1062
+ dataset:
1063
+ type: fiqa
1064
+ name: MTEB FiQA2018
1065
+ metrics:
1066
+ - type: map_at_1
1067
+ value: 0.03637
1068
+ - type: map_at_10
1069
+ value: 0.06084
1070
+ - type: map_at_100
1071
+ value: 0.06919
1072
+ - type: map_at_1000
1073
+ value: 0.07108
1074
+ - type: map_at_3
1075
+ value: 0.05071
1076
+ - type: map_at_5
1077
+ value: 0.05565
1078
+ - type: ndcg_at_1
1079
+ value: 0.07407
1080
+ - type: ndcg_at_10
1081
+ value: 0.0894
1082
+ - type: ndcg_at_100
1083
+ value: 0.13595
1084
+ - type: ndcg_at_1000
1085
+ value: 0.1829
1086
+ - type: ndcg_at_3
1087
+ value: 0.07393
1088
+ - type: ndcg_at_5
1089
+ value: 0.07854
1090
+ - type: precision_at_1
1091
+ value: 0.07407
1092
+ - type: precision_at_10
1093
+ value: 0.02778
1094
+ - type: precision_at_100
1095
+ value: 0.0075
1096
+ - type: precision_at_1000
1097
+ value: 0.00154
1098
+ - type: precision_at_3
1099
+ value: 0.05144
1100
+ - type: precision_at_5
1101
+ value: 0.03981
1102
+ - type: recall_at_1
1103
+ value: 0.03637
1104
+ - type: recall_at_10
1105
+ value: 0.11821
1106
+ - type: recall_at_100
1107
+ value: 0.3018
1108
+ - type: recall_at_1000
1109
+ value: 0.60207
1110
+ - type: recall_at_3
1111
+ value: 0.06839
1112
+ - type: recall_at_5
1113
+ value: 0.08649
1114
+ - task:
1115
+ type: Classification
1116
+ dataset:
1117
+ type: mteb/amazon_massive_intent
1118
+ name: MTEB MassiveIntentClassification (af)
1119
+ metrics:
1120
+ - type: accuracy
1121
+ value: 0.3779421654337593
1122
+ - type: f1
1123
+ value: 0.3681580701507746
1124
+ - task:
1125
+ type: Classification
1126
+ dataset:
1127
+ type: mteb/amazon_massive_intent
1128
+ name: MTEB MassiveIntentClassification (am)
1129
+ metrics:
1130
+ - type: accuracy
1131
+ value: 0.23722259583053126
1132
+ - type: f1
1133
+ value: 0.23235269695764274
1134
+ - task:
1135
+ type: Classification
1136
+ dataset:
1137
+ type: mteb/amazon_massive_intent
1138
+ name: MTEB MassiveIntentClassification (ar)
1139
+ metrics:
1140
+ - type: accuracy
1141
+ value: 0.2964021519838601
1142
+ - type: f1
1143
+ value: 0.28273175327650135
1144
+ - task:
1145
+ type: Classification
1146
+ dataset:
1147
+ type: mteb/amazon_massive_intent
1148
+ name: MTEB MassiveIntentClassification (az)
1149
+ metrics:
1150
+ - type: accuracy
1151
+ value: 0.39475453934095495
1152
+ - type: f1
1153
+ value: 0.39259973614151206
1154
+ - task:
1155
+ type: Classification
1156
+ dataset:
1157
+ type: mteb/amazon_massive_intent
1158
+ name: MTEB MassiveIntentClassification (bn)
1159
+ metrics:
1160
+ - type: accuracy
1161
+ value: 0.26550100874243443
1162
+ - type: f1
1163
+ value: 0.25607924873522975
1164
+ - task:
1165
+ type: Classification
1166
+ dataset:
1167
+ type: mteb/amazon_massive_intent
1168
+ name: MTEB MassiveIntentClassification (cy)
1169
+ metrics:
1170
+ - type: accuracy
1171
+ value: 0.38782784129119036
1172
+ - type: f1
1173
+ value: 0.3764180582626517
1174
+ - task:
1175
+ type: Classification
1176
+ dataset:
1177
+ type: mteb/amazon_massive_intent
1178
+ name: MTEB MassiveIntentClassification (da)
1179
+ metrics:
1180
+ - type: accuracy
1181
+ value: 0.43557498318762605
1182
+ - type: f1
1183
+ value: 0.4135305173800667
1184
+ - task:
1185
+ type: Classification
1186
+ dataset:
1187
+ type: mteb/amazon_massive_intent
1188
+ name: MTEB MassiveIntentClassification (de)
1189
+ metrics:
1190
+ - type: accuracy
1191
+ value: 0.4039340954942838
1192
+ - type: f1
1193
+ value: 0.38333932195289344
1194
+ - task:
1195
+ type: Classification
1196
+ dataset:
1197
+ type: mteb/amazon_massive_intent
1198
+ name: MTEB MassiveIntentClassification (el)
1199
+ metrics:
1200
+ - type: accuracy
1201
+ value: 0.3728648285137861
1202
+ - type: f1
1203
+ value: 0.36640059066802844
1204
+ - task:
1205
+ type: Classification
1206
+ dataset:
1207
+ type: mteb/amazon_massive_intent
1208
+ name: MTEB MassiveIntentClassification (en)
1209
+ metrics:
1210
+ - type: accuracy
1211
+ value: 0.5808002689979825
1212
+ - type: f1
1213
+ value: 0.5649243881660991
1214
+ - task:
1215
+ type: Classification
1216
+ dataset:
1217
+ type: mteb/amazon_massive_intent
1218
+ name: MTEB MassiveIntentClassification (es)
1219
+ metrics:
1220
+ - type: accuracy
1221
+ value: 0.411768661735037
1222
+ - type: f1
1223
+ value: 0.4066779962225799
1224
+ - task:
1225
+ type: Classification
1226
+ dataset:
1227
+ type: mteb/amazon_massive_intent
1228
+ name: MTEB MassiveIntentClassification (fa)
1229
+ metrics:
1230
+ - type: accuracy
1231
+ value: 0.36422326832548757
1232
+ - type: f1
1233
+ value: 0.34644173804288503
1234
+ - task:
1235
+ type: Classification
1236
+ dataset:
1237
+ type: mteb/amazon_massive_intent
1238
+ name: MTEB MassiveIntentClassification (fi)
1239
+ metrics:
1240
+ - type: accuracy
1241
+ value: 0.3875588433086752
1242
+ - type: f1
1243
+ value: 0.3726725894668694
1244
+ - task:
1245
+ type: Classification
1246
+ dataset:
1247
+ type: mteb/amazon_massive_intent
1248
+ name: MTEB MassiveIntentClassification (fr)
1249
+ metrics:
1250
+ - type: accuracy
1251
+ value: 0.43671822461331533
1252
+ - type: f1
1253
+ value: 0.423518466245666
1254
+ - task:
1255
+ type: Classification
1256
+ dataset:
1257
+ type: mteb/amazon_massive_intent
1258
+ name: MTEB MassiveIntentClassification (he)
1259
+ metrics:
1260
+ - type: accuracy
1261
+ value: 0.3198049764626766
1262
+ - type: f1
1263
+ value: 0.3055792887280901
1264
+ - task:
1265
+ type: Classification
1266
+ dataset:
1267
+ type: mteb/amazon_massive_intent
1268
+ name: MTEB MassiveIntentClassification (hi)
1269
+ metrics:
1270
+ - type: accuracy
1271
+ value: 0.2803967720242098
1272
+ - type: f1
1273
+ value: 0.28428418145508305
1274
+ - task:
1275
+ type: Classification
1276
+ dataset:
1277
+ type: mteb/amazon_massive_intent
1278
+ name: MTEB MassiveIntentClassification (hu)
1279
+ metrics:
1280
+ - type: accuracy
1281
+ value: 0.3813718897108272
1282
+ - type: f1
1283
+ value: 0.3705740698819687
1284
+ - task:
1285
+ type: Classification
1286
+ dataset:
1287
+ type: mteb/amazon_massive_intent
1288
+ name: MTEB MassiveIntentClassification (hy)
1289
+ metrics:
1290
+ - type: accuracy
1291
+ value: 0.2605245460659045
1292
+ - type: f1
1293
+ value: 0.2525483953344816
1294
+ - task:
1295
+ type: Classification
1296
+ dataset:
1297
+ type: mteb/amazon_massive_intent
1298
+ name: MTEB MassiveIntentClassification (id)
1299
+ metrics:
1300
+ - type: accuracy
1301
+ value: 0.41156691324815065
1302
+ - type: f1
1303
+ value: 0.40837150332476047
1304
+ - task:
1305
+ type: Classification
1306
+ dataset:
1307
+ type: mteb/amazon_massive_intent
1308
+ name: MTEB MassiveIntentClassification (is)
1309
+ metrics:
1310
+ - type: accuracy
1311
+ value: 0.38628110289172835
1312
+ - type: f1
1313
+ value: 0.37676919012460314
1314
+ - task:
1315
+ type: Classification
1316
+ dataset:
1317
+ type: mteb/amazon_massive_intent
1318
+ name: MTEB MassiveIntentClassification (it)
1319
+ metrics:
1320
+ - type: accuracy
1321
+ value: 0.440383322125084
1322
+ - type: f1
1323
+ value: 0.43772590108774556
1324
+ - task:
1325
+ type: Classification
1326
+ dataset:
1327
+ type: mteb/amazon_massive_intent
1328
+ name: MTEB MassiveIntentClassification (ja)
1329
+ metrics:
1330
+ - type: accuracy
1331
+ value: 0.46207128446536655
1332
+ - type: f1
1333
+ value: 0.44666328759408236
1334
+ - task:
1335
+ type: Classification
1336
+ dataset:
1337
+ type: mteb/amazon_massive_intent
1338
+ name: MTEB MassiveIntentClassification (jv)
1339
+ metrics:
1340
+ - type: accuracy
1341
+ value: 0.3760591795561533
1342
+ - type: f1
1343
+ value: 0.36581071742378013
1344
+ - task:
1345
+ type: Classification
1346
+ dataset:
1347
+ type: mteb/amazon_massive_intent
1348
+ name: MTEB MassiveIntentClassification (ka)
1349
+ metrics:
1350
+ - type: accuracy
1351
+ value: 0.24472091459314052
1352
+ - type: f1
1353
+ value: 0.24238209697895607
1354
+ - task:
1355
+ type: Classification
1356
+ dataset:
1357
+ type: mteb/amazon_massive_intent
1358
+ name: MTEB MassiveIntentClassification (km)
1359
+ metrics:
1360
+ - type: accuracy
1361
+ value: 0.2623739071956961
1362
+ - type: f1
1363
+ value: 0.2537878315084505
1364
+ - task:
1365
+ type: Classification
1366
+ dataset:
1367
+ type: mteb/amazon_massive_intent
1368
+ name: MTEB MassiveIntentClassification (kn)
1369
+ metrics:
1370
+ - type: accuracy
1371
+ value: 0.17831203765971754
1372
+ - type: f1
1373
+ value: 0.17275078420466344
1374
+ - task:
1375
+ type: Classification
1376
+ dataset:
1377
+ type: mteb/amazon_massive_intent
1378
+ name: MTEB MassiveIntentClassification (ko)
1379
+ metrics:
1380
+ - type: accuracy
1381
+ value: 0.37266308002689974
1382
+ - type: f1
1383
+ value: 0.3692473791708214
1384
+ - task:
1385
+ type: Classification
1386
+ dataset:
1387
+ type: mteb/amazon_massive_intent
1388
+ name: MTEB MassiveIntentClassification (lv)
1389
+ metrics:
1390
+ - type: accuracy
1391
+ value: 0.4093140551445864
1392
+ - type: f1
1393
+ value: 0.4082522788964197
1394
+ - task:
1395
+ type: Classification
1396
+ dataset:
1397
+ type: mteb/amazon_massive_intent
1398
+ name: MTEB MassiveIntentClassification (ml)
1399
+ metrics:
1400
+ - type: accuracy
1401
+ value: 0.1788500336247478
1402
+ - type: f1
1403
+ value: 0.17621569082971816
1404
+ - task:
1405
+ type: Classification
1406
+ dataset:
1407
+ type: mteb/amazon_massive_intent
1408
+ name: MTEB MassiveIntentClassification (mn)
1409
+ metrics:
1410
+ - type: accuracy
1411
+ value: 0.3297579018157364
1412
+ - type: f1
1413
+ value: 0.33402014633349664
1414
+ - task:
1415
+ type: Classification
1416
+ dataset:
1417
+ type: mteb/amazon_massive_intent
1418
+ name: MTEB MassiveIntentClassification (ms)
1419
+ metrics:
1420
+ - type: accuracy
1421
+ value: 0.40911230665770015
1422
+ - type: f1
1423
+ value: 0.4009538559124075
1424
+ - task:
1425
+ type: Classification
1426
+ dataset:
1427
+ type: mteb/amazon_massive_intent
1428
+ name: MTEB MassiveIntentClassification (my)
1429
+ metrics:
1430
+ - type: accuracy
1431
+ value: 0.17834566240753194
1432
+ - type: f1
1433
+ value: 0.17006381849454313
1434
+ - task:
1435
+ type: Classification
1436
+ dataset:
1437
+ type: mteb/amazon_massive_intent
1438
+ name: MTEB MassiveIntentClassification (nb)
1439
+ metrics:
1440
+ - type: accuracy
1441
+ value: 0.3947881640887693
1442
+ - type: f1
1443
+ value: 0.37819934317839304
1444
+ - task:
1445
+ type: Classification
1446
+ dataset:
1447
+ type: mteb/amazon_massive_intent
1448
+ name: MTEB MassiveIntentClassification (nl)
1449
+ metrics:
1450
+ - type: accuracy
1451
+ value: 0.4176193678547412
1452
+ - type: f1
1453
+ value: 0.40281991759509694
1454
+ - task:
1455
+ type: Classification
1456
+ dataset:
1457
+ type: mteb/amazon_massive_intent
1458
+ name: MTEB MassiveIntentClassification (pl)
1459
+ metrics:
1460
+ - type: accuracy
1461
+ value: 0.4261936785474109
1462
+ - type: f1
1463
+ value: 0.40836739146499046
1464
+ - task:
1465
+ type: Classification
1466
+ dataset:
1467
+ type: mteb/amazon_massive_intent
1468
+ name: MTEB MassiveIntentClassification (pt)
1469
+ metrics:
1470
+ - type: accuracy
1471
+ value: 0.44542703429724273
1472
+ - type: f1
1473
+ value: 0.43452431642784484
1474
+ - task:
1475
+ type: Classification
1476
+ dataset:
1477
+ type: mteb/amazon_massive_intent
1478
+ name: MTEB MassiveIntentClassification (ro)
1479
+ metrics:
1480
+ - type: accuracy
1481
+ value: 0.3996973772696705
1482
+ - type: f1
1483
+ value: 0.3874209466530094
1484
+ - task:
1485
+ type: Classification
1486
+ dataset:
1487
+ type: mteb/amazon_massive_intent
1488
+ name: MTEB MassiveIntentClassification (ru)
1489
+ metrics:
1490
+ - type: accuracy
1491
+ value: 0.37461331540013454
1492
+ - type: f1
1493
+ value: 0.3691132021821187
1494
+ - task:
1495
+ type: Classification
1496
+ dataset:
1497
+ type: mteb/amazon_massive_intent
1498
+ name: MTEB MassiveIntentClassification (sl)
1499
+ metrics:
1500
+ - type: accuracy
1501
+ value: 0.3828850033624748
1502
+ - type: f1
1503
+ value: 0.3737259394049676
1504
+ - task:
1505
+ type: Classification
1506
+ dataset:
1507
+ type: mteb/amazon_massive_intent
1508
+ name: MTEB MassiveIntentClassification (sq)
1509
+ metrics:
1510
+ - type: accuracy
1511
+ value: 0.4095494283792872
1512
+ - type: f1
1513
+ value: 0.3976770790286908
1514
+ - task:
1515
+ type: Classification
1516
+ dataset:
1517
+ type: mteb/amazon_massive_intent
1518
+ name: MTEB MassiveIntentClassification (sv)
1519
+ metrics:
1520
+ - type: accuracy
1521
+ value: 0.4185272360457296
1522
+ - type: f1
1523
+ value: 0.4042848260365438
1524
+ - task:
1525
+ type: Classification
1526
+ dataset:
1527
+ type: mteb/amazon_massive_intent
1528
+ name: MTEB MassiveIntentClassification (sw)
1529
+ metrics:
1530
+ - type: accuracy
1531
+ value: 0.3832885003362475
1532
+ - type: f1
1533
+ value: 0.3690334596675622
1534
+ - task:
1535
+ type: Classification
1536
+ dataset:
1537
+ type: mteb/amazon_massive_intent
1538
+ name: MTEB MassiveIntentClassification (ta)
1539
+ metrics:
1540
+ - type: accuracy
1541
+ value: 0.19031607262945527
1542
+ - type: f1
1543
+ value: 0.18665103063257613
1544
+ - task:
1545
+ type: Classification
1546
+ dataset:
1547
+ type: mteb/amazon_massive_intent
1548
+ name: MTEB MassiveIntentClassification (te)
1549
+ metrics:
1550
+ - type: accuracy
1551
+ value: 0.1938466711499664
1552
+ - type: f1
1553
+ value: 0.19186399376652535
1554
+ - task:
1555
+ type: Classification
1556
+ dataset:
1557
+ type: mteb/amazon_massive_intent
1558
+ name: MTEB MassiveIntentClassification (th)
1559
+ metrics:
1560
+ - type: accuracy
1561
+ value: 0.34088769334229996
1562
+ - type: f1
1563
+ value: 0.3420383086009429
1564
+ - task:
1565
+ type: Classification
1566
+ dataset:
1567
+ type: mteb/amazon_massive_intent
1568
+ name: MTEB MassiveIntentClassification (tl)
1569
+ metrics:
1570
+ - type: accuracy
1571
+ value: 0.40285810356422325
1572
+ - type: f1
1573
+ value: 0.39361500249640413
1574
+ - task:
1575
+ type: Classification
1576
+ dataset:
1577
+ type: mteb/amazon_massive_intent
1578
+ name: MTEB MassiveIntentClassification (tr)
1579
+ metrics:
1580
+ - type: accuracy
1581
+ value: 0.38860121049092133
1582
+ - type: f1
1583
+ value: 0.3781916859627235
1584
+ - task:
1585
+ type: Classification
1586
+ dataset:
1587
+ type: mteb/amazon_massive_intent
1588
+ name: MTEB MassiveIntentClassification (ur)
1589
+ metrics:
1590
+ - type: accuracy
1591
+ value: 0.27834566240753195
1592
+ - type: f1
1593
+ value: 0.26898389386106486
1594
+ - task:
1595
+ type: Classification
1596
+ dataset:
1597
+ type: mteb/amazon_massive_intent
1598
+ name: MTEB MassiveIntentClassification (vi)
1599
+ metrics:
1600
+ - type: accuracy
1601
+ value: 0.38705447209145927
1602
+ - type: f1
1603
+ value: 0.3828002644202441
1604
+ - task:
1605
+ type: Classification
1606
+ dataset:
1607
+ type: mteb/amazon_massive_intent
1608
+ name: MTEB MassiveIntentClassification (zh-CN)
1609
+ metrics:
1610
+ - type: accuracy
1611
+ value: 0.45780094149293876
1612
+ - type: f1
1613
+ value: 0.4421526778674136
1614
+ - task:
1615
+ type: Classification
1616
+ dataset:
1617
+ type: mteb/amazon_massive_intent
1618
+ name: MTEB MassiveIntentClassification (zh-TW)
1619
+ metrics:
1620
+ - type: accuracy
1621
+ value: 0.4232010759919301
1622
+ - type: f1
1623
+ value: 0.4225772977490916
1624
+ - task:
1625
+ type: Classification
1626
+ dataset:
1627
+ type: mteb/amazon_polarity
1628
+ name: MTEB AmazonPolarityClassification
1629
+ metrics:
1630
+ - type: accuracy
1631
+ value: 0.74938225
1632
+ - type: ap
1633
+ value: 0.6958187110320567
1634
+ - type: f1
1635
+ value: 0.7472744058439321
1636
+ - task:
1637
+ type: Retrieval
1638
+ dataset:
1639
+ type: dbpedia-entity
1640
+ name: MTEB DBPedia
1641
+ metrics:
1642
+ - type: map_at_1
1643
+ value: 0.01764
1644
+ - type: map_at_10
1645
+ value: 0.0386
1646
+ - type: map_at_100
1647
+ value: 0.05457
1648
+ - type: map_at_1000
1649
+ value: 0.05938
1650
+ - type: map_at_3
1651
+ value: 0.02667
1652
+ - type: map_at_5
1653
+ value: 0.0322
1654
+ - type: ndcg_at_1
1655
+ value: 0.14
1656
+ - type: ndcg_at_10
1657
+ value: 0.10868
1658
+ - type: ndcg_at_100
1659
+ value: 0.12866
1660
+ - type: ndcg_at_1000
1661
+ value: 0.1743
1662
+ - type: ndcg_at_3
1663
+ value: 0.11943
1664
+ - type: ndcg_at_5
1665
+ value: 0.1166
1666
+ - type: precision_at_1
1667
+ value: 0.1925
1668
+ - type: precision_at_10
1669
+ value: 0.10275
1670
+ - type: precision_at_100
1671
+ value: 0.03527
1672
+ - type: precision_at_1000
1673
+ value: 0.00912
1674
+ - type: precision_at_3
1675
+ value: 0.14917
1676
+ - type: precision_at_5
1677
+ value: 0.135
1678
+ - type: recall_at_1
1679
+ value: 0.01764
1680
+ - type: recall_at_10
1681
+ value: 0.06609
1682
+ - type: recall_at_100
1683
+ value: 0.17616
1684
+ - type: recall_at_1000
1685
+ value: 0.33085
1686
+ - type: recall_at_3
1687
+ value: 0.03115
1688
+ - type: recall_at_5
1689
+ value: 0.04605
1690
+ - task:
1691
+ type: Retrieval
1692
+ dataset:
1693
+ type: fever
1694
+ name: MTEB FEVER
1695
+ metrics:
1696
+ - type: map_at_1
1697
+ value: 0.11497
1698
+ - type: map_at_10
1699
+ value: 0.15744
1700
+ - type: map_at_100
1701
+ value: 0.163
1702
+ - type: map_at_1000
1703
+ value: 0.16365
1704
+ - type: map_at_3
1705
+ value: 0.1444
1706
+ - type: map_at_5
1707
+ value: 0.1518
1708
+ - type: ndcg_at_1
1709
+ value: 0.12346
1710
+ - type: ndcg_at_10
1711
+ value: 0.18399
1712
+ - type: ndcg_at_100
1713
+ value: 0.21399
1714
+ - type: ndcg_at_1000
1715
+ value: 0.23442
1716
+ - type: ndcg_at_3
1717
+ value: 0.15695
1718
+ - type: ndcg_at_5
1719
+ value: 0.17027
1720
+ - type: precision_at_1
1721
+ value: 0.12346
1722
+ - type: precision_at_10
1723
+ value: 0.02798
1724
+ - type: precision_at_100
1725
+ value: 0.00445
1726
+ - type: precision_at_1000
1727
+ value: 0.00063
1728
+ - type: precision_at_3
1729
+ value: 0.06586
1730
+ - type: precision_at_5
1731
+ value: 0.04665
1732
+ - type: recall_at_1
1733
+ value: 0.11497
1734
+ - type: recall_at_10
1735
+ value: 0.25636
1736
+ - type: recall_at_100
1737
+ value: 0.39894
1738
+ - type: recall_at_1000
1739
+ value: 0.56181
1740
+ - type: recall_at_3
1741
+ value: 0.18273
1742
+ - type: recall_at_5
1743
+ value: 0.21474
1744
+ - task:
1745
+ type: Retrieval
1746
+ dataset:
1747
+ type: BeIR/cqadupstack
1748
+ name: MTEB CQADupstackProgrammersRetrieval
1749
+ metrics:
1750
+ - type: map_at_1
1751
+ value: 0.12598
1752
+ - type: map_at_10
1753
+ value: 0.17304
1754
+ - type: map_at_100
1755
+ value: 0.18209
1756
+ - type: map_at_1000
1757
+ value: 0.18328
1758
+ - type: map_at_3
1759
+ value: 0.15784
1760
+ - type: map_at_5
1761
+ value: 0.1667
1762
+ - type: ndcg_at_1
1763
+ value: 0.15868
1764
+ - type: ndcg_at_10
1765
+ value: 0.20623
1766
+ - type: ndcg_at_100
1767
+ value: 0.25093
1768
+ - type: ndcg_at_1000
1769
+ value: 0.28498
1770
+ - type: ndcg_at_3
1771
+ value: 0.17912
1772
+ - type: ndcg_at_5
1773
+ value: 0.19198
1774
+ - type: precision_at_1
1775
+ value: 0.15868
1776
+ - type: precision_at_10
1777
+ value: 0.03767
1778
+ - type: precision_at_100
1779
+ value: 0.00716
1780
+ - type: precision_at_1000
1781
+ value: 0.00118
1782
+ - type: precision_at_3
1783
+ value: 0.08638
1784
+ - type: precision_at_5
1785
+ value: 0.0621
1786
+ - type: recall_at_1
1787
+ value: 0.12598
1788
+ - type: recall_at_10
1789
+ value: 0.27144
1790
+ - type: recall_at_100
1791
+ value: 0.46817
1792
+ - type: recall_at_1000
1793
+ value: 0.71861
1794
+ - type: recall_at_3
1795
+ value: 0.19231
1796
+ - type: recall_at_5
1797
+ value: 0.22716
1798
+ - task:
1799
+ type: STS
1800
+ dataset:
1801
+ type: mteb/sts22-crosslingual-sts
1802
+ name: MTEB STS22 (en)
1803
+ metrics:
1804
+ - type: cos_sim_pearson
1805
+ value: 0.5917638344661753
1806
+ - type: cos_sim_spearman
1807
+ value: 0.5963676007113087
1808
+ - type: euclidean_pearson
1809
+ value: 0.5668753290255448
1810
+ - type: euclidean_spearman
1811
+ value: 0.5761328025857448
1812
+ - type: manhattan_pearson
1813
+ value: 0.5692312052723706
1814
+ - type: manhattan_spearman
1815
+ value: 0.5776774918418505
1816
+ - task:
1817
+ type: STS
1818
+ dataset:
1819
+ type: mteb/sts22-crosslingual-sts
1820
+ name: MTEB STS22 (de)
1821
+ metrics:
1822
+ - type: cos_sim_pearson
1823
+ value: 0.10322254716987457
1824
+ - type: cos_sim_spearman
1825
+ value: 0.110033092996862
1826
+ - type: euclidean_pearson
1827
+ value: 0.06006926471684402
1828
+ - type: euclidean_spearman
1829
+ value: 0.10972140246688376
1830
+ - type: manhattan_pearson
1831
+ value: 0.05933298751861177
1832
+ - type: manhattan_spearman
1833
+ value: 0.11030111585680233
1834
+ - task:
1835
+ type: STS
1836
+ dataset:
1837
+ type: mteb/sts22-crosslingual-sts
1838
+ name: MTEB STS22 (es)
1839
+ metrics:
1840
+ - type: cos_sim_pearson
1841
+ value: 0.4338031880545056
1842
+ - type: cos_sim_spearman
1843
+ value: 0.4305358201410913
1844
+ - type: euclidean_pearson
1845
+ value: 0.42723271963625525
1846
+ - type: euclidean_spearman
1847
+ value: 0.4255163899944477
1848
+ - type: manhattan_pearson
1849
+ value: 0.44015574997805873
1850
+ - type: manhattan_spearman
1851
+ value: 0.43124732216158546
1852
+ - task:
1853
+ type: STS
1854
+ dataset:
1855
+ type: mteb/sts22-crosslingual-sts
1856
+ name: MTEB STS22 (pl)
1857
+ metrics:
1858
+ - type: cos_sim_pearson
1859
+ value: 0.042912905043631364
1860
+ - type: cos_sim_spearman
1861
+ value: 0.1491272748789348
1862
+ - type: euclidean_pearson
1863
+ value: 0.032855132112394485
1864
+ - type: euclidean_spearman
1865
+ value: 0.16575204463951024
1866
+ - type: manhattan_pearson
1867
+ value: 0.03239877672346581
1868
+ - type: manhattan_spearman
1869
+ value: 0.16841985772913856
1870
+ - task:
1871
+ type: STS
1872
+ dataset:
1873
+ type: mteb/sts22-crosslingual-sts
1874
+ name: MTEB STS22 (tr)
1875
+ metrics:
1876
+ - type: cos_sim_pearson
1877
+ value: 0.041027394985558165
1878
+ - type: cos_sim_spearman
1879
+ value: 0.03818238576547375
1880
+ - type: euclidean_pearson
1881
+ value: 0.023181033496453556
1882
+ - type: euclidean_spearman
1883
+ value: 0.051826811802703564
1884
+ - type: manhattan_pearson
1885
+ value: 0.04800617926525645
1886
+ - type: manhattan_spearman
1887
+ value: 0.06738401400306251
1888
+ - task:
1889
+ type: STS
1890
+ dataset:
1891
+ type: mteb/sts22-crosslingual-sts
1892
+ name: MTEB STS22 (ar)
1893
+ metrics:
1894
+ - type: cos_sim_pearson
1895
+ value: 0.0238765395226737
1896
+ - type: cos_sim_spearman
1897
+ value: 0.051738993911623274
1898
+ - type: euclidean_pearson
1899
+ value: 0.030710263954769824
1900
+ - type: euclidean_spearman
1901
+ value: 0.050492229090398195
1902
+ - type: manhattan_pearson
1903
+ value: 0.0378263141098617
1904
+ - type: manhattan_spearman
1905
+ value: 0.05042238232170212
1906
+ - task:
1907
+ type: STS
1908
+ dataset:
1909
+ type: mteb/sts22-crosslingual-sts
1910
+ name: MTEB STS22 (ru)
1911
+ metrics:
1912
+ - type: cos_sim_pearson
1913
+ value: 0.07673549067267635
1914
+ - type: cos_sim_spearman
1915
+ value: 0.03363121525687889
1916
+ - type: euclidean_pearson
1917
+ value: 0.0464331702652217
1918
+ - type: euclidean_spearman
1919
+ value: 0.036129205171334326
1920
+ - type: manhattan_pearson
1921
+ value: 0.040112317360761963
1922
+ - type: manhattan_spearman
1923
+ value: 0.03233959766173701
1924
+ - task:
1925
+ type: STS
1926
+ dataset:
1927
+ type: mteb/sts22-crosslingual-sts
1928
+ name: MTEB STS22 (zh)
1929
+ metrics:
1930
+ - type: cos_sim_pearson
1931
+ value: 0.0006167614416104335
1932
+ - type: cos_sim_spearman
1933
+ value: 0.06521685391703255
1934
+ - type: euclidean_pearson
1935
+ value: 0.048845725790690325
1936
+ - type: euclidean_spearman
1937
+ value: 0.0559058032900239
1938
+ - type: manhattan_pearson
1939
+ value: 0.06139838096573896
1940
+ - type: manhattan_spearman
1941
+ value: 0.050060884837066215
1942
+ - task:
1943
+ type: STS
1944
+ dataset:
1945
+ type: mteb/sts22-crosslingual-sts
1946
+ name: MTEB STS22 (fr)
1947
+ metrics:
1948
+ - type: cos_sim_pearson
1949
+ value: 0.5319490347682836
1950
+ - type: cos_sim_spearman
1951
+ value: 0.5456055727079527
1952
+ - type: euclidean_pearson
1953
+ value: 0.5255574442039842
1954
+ - type: euclidean_spearman
1955
+ value: 0.5294640154371587
1956
+ - type: manhattan_pearson
1957
+ value: 0.532759930404542
1958
+ - type: manhattan_spearman
1959
+ value: 0.5317456150351015
1960
+ - task:
1961
+ type: STS
1962
+ dataset:
1963
+ type: mteb/sts22-crosslingual-sts
1964
+ name: MTEB STS22 (de-en)
1965
+ metrics:
1966
+ - type: cos_sim_pearson
1967
+ value: 0.5115115853012214
1968
+ - type: cos_sim_spearman
1969
+ value: 0.5392692508173665
1970
+ - type: euclidean_pearson
1971
+ value: 0.4455629287737235
1972
+ - type: euclidean_spearman
1973
+ value: 0.46222372143731383
1974
+ - type: manhattan_pearson
1975
+ value: 0.42831322151459006
1976
+ - type: manhattan_spearman
1977
+ value: 0.4570991764985799
1978
+ - task:
1979
+ type: STS
1980
+ dataset:
1981
+ type: mteb/sts22-crosslingual-sts
1982
+ name: MTEB STS22 (es-en)
1983
+ metrics:
1984
+ - type: cos_sim_pearson
1985
+ value: 0.30361948851267917
1986
+ - type: cos_sim_spearman
1987
+ value: 0.32739632941633834
1988
+ - type: euclidean_pearson
1989
+ value: 0.2983135800843496
1990
+ - type: euclidean_spearman
1991
+ value: 0.3111440600132692
1992
+ - type: manhattan_pearson
1993
+ value: 0.31264502938148286
1994
+ - type: manhattan_spearman
1995
+ value: 0.33311204075347495
1996
+ - task:
1997
+ type: STS
1998
+ dataset:
1999
+ type: mteb/sts22-crosslingual-sts
2000
+ name: MTEB STS22 (it)
2001
+ metrics:
2002
+ - type: cos_sim_pearson
2003
+ value: 0.3523883630335275
2004
+ - type: cos_sim_spearman
2005
+ value: 0.33677970820867037
2006
+ - type: euclidean_pearson
2007
+ value: 0.34878640693874546
2008
+ - type: euclidean_spearman
2009
+ value: 0.33525189235133496
2010
+ - type: manhattan_pearson
2011
+ value: 0.3422761246389947
2012
+ - type: manhattan_spearman
2013
+ value: 0.32713218497609176
2014
+ - task:
2015
+ type: STS
2016
+ dataset:
2017
+ type: mteb/sts22-crosslingual-sts
2018
+ name: MTEB STS22 (pl-en)
2019
+ metrics:
2020
+ - type: cos_sim_pearson
2021
+ value: 0.19809302548119545
2022
+ - type: cos_sim_spearman
2023
+ value: 0.205403702021155
2024
+ - type: euclidean_pearson
2025
+ value: 0.23006803962133016
2026
+ - type: euclidean_spearman
2027
+ value: 0.2296270653079511
2028
+ - type: manhattan_pearson
2029
+ value: 0.2540168317585851
2030
+ - type: manhattan_spearman
2031
+ value: 0.25421508137540866
2032
+ - task:
2033
+ type: STS
2034
+ dataset:
2035
+ type: mteb/sts22-crosslingual-sts
2036
+ name: MTEB STS22 (zh-en)
2037
+ metrics:
2038
+ - type: cos_sim_pearson
2039
+ value: 0.20393500955410487
2040
+ - type: cos_sim_spearman
2041
+ value: 0.267057136930116
2042
+ - type: euclidean_pearson
2043
+ value: 0.18168376767724584
2044
+ - type: euclidean_spearman
2045
+ value: 0.19260826601517245
2046
+ - type: manhattan_pearson
2047
+ value: 0.18302619990671526
2048
+ - type: manhattan_spearman
2049
+ value: 0.194691037846159
2050
+ - task:
2051
+ type: STS
2052
+ dataset:
2053
+ type: mteb/sts22-crosslingual-sts
2054
+ name: MTEB STS22 (es-it)
2055
+ metrics:
2056
+ - type: cos_sim_pearson
2057
+ value: 0.36589199830751484
2058
+ - type: cos_sim_spearman
2059
+ value: 0.3598972209997404
2060
+ - type: euclidean_pearson
2061
+ value: 0.4104511254757421
2062
+ - type: euclidean_spearman
2063
+ value: 0.39322301680629834
2064
+ - type: manhattan_pearson
2065
+ value: 0.4136802503205308
2066
+ - type: manhattan_spearman
2067
+ value: 0.4076270030293609
2068
+ - task:
2069
+ type: STS
2070
+ dataset:
2071
+ type: mteb/sts22-crosslingual-sts
2072
+ name: MTEB STS22 (de-fr)
2073
+ metrics:
2074
+ - type: cos_sim_pearson
2075
+ value: 0.26350936227950084
2076
+ - type: cos_sim_spearman
2077
+ value: 0.25108218032460344
2078
+ - type: euclidean_pearson
2079
+ value: 0.2861681094744849
2080
+ - type: euclidean_spearman
2081
+ value: 0.2735099020394359
2082
+ - type: manhattan_pearson
2083
+ value: 0.30527977072984513
2084
+ - type: manhattan_spearman
2085
+ value: 0.2640333999064081
2086
+ - task:
2087
+ type: STS
2088
+ dataset:
2089
+ type: mteb/sts22-crosslingual-sts
2090
+ name: MTEB STS22 (de-pl)
2091
+ metrics:
2092
+ - type: cos_sim_pearson
2093
+ value: 0.20056269198600324
2094
+ - type: cos_sim_spearman
2095
+ value: 0.20939990379746756
2096
+ - type: euclidean_pearson
2097
+ value: 0.18942765438962197
2098
+ - type: euclidean_spearman
2099
+ value: 0.21709842967237447
2100
+ - type: manhattan_pearson
2101
+ value: 0.23643909798655122
2102
+ - type: manhattan_spearman
2103
+ value: 0.2358828328071473
2104
+ - task:
2105
+ type: STS
2106
+ dataset:
2107
+ type: mteb/sts22-crosslingual-sts
2108
+ name: MTEB STS22 (fr-pl)
2109
+ metrics:
2110
+ - type: cos_sim_pearson
2111
+ value: 0.19563740271419394
2112
+ - type: cos_sim_spearman
2113
+ value: 0.05634361698190111
2114
+ - type: euclidean_pearson
2115
+ value: 0.16833522619239474
2116
+ - type: euclidean_spearman
2117
+ value: 0.16903085094570333
2118
+ - type: manhattan_pearson
2119
+ value: 0.058053927126608146
2120
+ - type: manhattan_spearman
2121
+ value: 0.16903085094570333
2122
+ - task:
2123
+ type: Classification
2124
+ dataset:
2125
+ type: mteb/amazon_massive_scenario
2126
+ name: MTEB MassiveScenarioClassification (af)
2127
+ metrics:
2128
+ - type: accuracy
2129
+ value: 0.40245460659045057
2130
+ - type: f1
2131
+ value: 0.3879924050989544
2132
+ - task:
2133
+ type: Classification
2134
+ dataset:
2135
+ type: mteb/amazon_massive_scenario
2136
+ name: MTEB MassiveScenarioClassification (am)
2137
+ metrics:
2138
+ - type: accuracy
2139
+ value: 0.2568930733019502
2140
+ - type: f1
2141
+ value: 0.2548816627916271
2142
+ - task:
2143
+ type: Classification
2144
+ dataset:
2145
+ type: mteb/amazon_massive_scenario
2146
+ name: MTEB MassiveScenarioClassification (ar)
2147
+ metrics:
2148
+ - type: accuracy
2149
+ value: 0.3239744451916611
2150
+ - type: f1
2151
+ value: 0.31863029579075774
2152
+ - task:
2153
+ type: Classification
2154
+ dataset:
2155
+ type: mteb/amazon_massive_scenario
2156
+ name: MTEB MassiveScenarioClassification (az)
2157
+ metrics:
2158
+ - type: accuracy
2159
+ value: 0.4053127101546738
2160
+ - type: f1
2161
+ value: 0.39707079033948933
2162
+ - task:
2163
+ type: Classification
2164
+ dataset:
2165
+ type: mteb/amazon_massive_scenario
2166
+ name: MTEB MassiveScenarioClassification (bn)
2167
+ metrics:
2168
+ - type: accuracy
2169
+ value: 0.2723268325487559
2170
+ - type: f1
2171
+ value: 0.2644365328185879
2172
+ - task:
2173
+ type: Classification
2174
+ dataset:
2175
+ type: mteb/amazon_massive_scenario
2176
+ name: MTEB MassiveScenarioClassification (cy)
2177
+ metrics:
2178
+ - type: accuracy
2179
+ value: 0.3869872225958305
2180
+ - type: f1
2181
+ value: 0.3655930387892567
2182
+ - task:
2183
+ type: Classification
2184
+ dataset:
2185
+ type: mteb/amazon_massive_scenario
2186
+ name: MTEB MassiveScenarioClassification (da)
2187
+ metrics:
2188
+ - type: accuracy
2189
+ value: 0.4475453934095494
2190
+ - type: f1
2191
+ value: 0.4287356484024154
2192
+ - task:
2193
+ type: Classification
2194
+ dataset:
2195
+ type: mteb/amazon_massive_scenario
2196
+ name: MTEB MassiveScenarioClassification (de)
2197
+ metrics:
2198
+ - type: accuracy
2199
+ value: 0.41355077336919976
2200
+ - type: f1
2201
+ value: 0.3982365179458047
2202
+ - task:
2203
+ type: Classification
2204
+ dataset:
2205
+ type: mteb/amazon_massive_scenario
2206
+ name: MTEB MassiveScenarioClassification (el)
2207
+ metrics:
2208
+ - type: accuracy
2209
+ value: 0.3843981170141224
2210
+ - type: f1
2211
+ value: 0.3702538368296387
2212
+ - task:
2213
+ type: Classification
2214
+ dataset:
2215
+ type: mteb/amazon_massive_scenario
2216
+ name: MTEB MassiveScenarioClassification (en)
2217
+ metrics:
2218
+ - type: accuracy
2219
+ value: 0.6633826496301277
2220
+ - type: f1
2221
+ value: 0.6589634765029931
2222
+ - task:
2223
+ type: Classification
2224
+ dataset:
2225
+ type: mteb/amazon_massive_scenario
2226
+ name: MTEB MassiveScenarioClassification (es)
2227
+ metrics:
2228
+ - type: accuracy
2229
+ value: 0.4417955615332885
2230
+ - type: f1
2231
+ value: 0.4310228811620319
2232
+ - task:
2233
+ type: Classification
2234
+ dataset:
2235
+ type: mteb/amazon_massive_scenario
2236
+ name: MTEB MassiveScenarioClassification (fa)
2237
+ metrics:
2238
+ - type: accuracy
2239
+ value: 0.3482851378614661
2240
+ - type: f1
2241
+ value: 0.33959524415028025
2242
+ - task:
2243
+ type: Classification
2244
+ dataset:
2245
+ type: mteb/amazon_massive_scenario
2246
+ name: MTEB MassiveScenarioClassification (fi)
2247
+ metrics:
2248
+ - type: accuracy
2249
+ value: 0.40561533288500334
2250
+ - type: f1
2251
+ value: 0.38049390117336274
2252
+ - task:
2253
+ type: Classification
2254
+ dataset:
2255
+ type: mteb/amazon_massive_scenario
2256
+ name: MTEB MassiveScenarioClassification (fr)
2257
+ metrics:
2258
+ - type: accuracy
2259
+ value: 0.45917955615332884
2260
+ - type: f1
2261
+ value: 0.4465741971572902
2262
+ - task:
2263
+ type: Classification
2264
+ dataset:
2265
+ type: mteb/amazon_massive_scenario
2266
+ name: MTEB MassiveScenarioClassification (he)
2267
+ metrics:
2268
+ - type: accuracy
2269
+ value: 0.3208473436449227
2270
+ - type: f1
2271
+ value: 0.2953932929808133
2272
+ - task:
2273
+ type: Classification
2274
+ dataset:
2275
+ type: mteb/amazon_massive_scenario
2276
+ name: MTEB MassiveScenarioClassification (hi)
2277
+ metrics:
2278
+ - type: accuracy
2279
+ value: 0.28369199731002015
2280
+ - type: f1
2281
+ value: 0.2752902837981212
2282
+ - task:
2283
+ type: Classification
2284
+ dataset:
2285
+ type: mteb/amazon_massive_scenario
2286
+ name: MTEB MassiveScenarioClassification (hu)
2287
+ metrics:
2288
+ - type: accuracy
2289
+ value: 0.3949226630800269
2290
+ - type: f1
2291
+ value: 0.37327234047050406
2292
+ - task:
2293
+ type: Classification
2294
+ dataset:
2295
+ type: mteb/amazon_massive_scenario
2296
+ name: MTEB MassiveScenarioClassification (hy)
2297
+ metrics:
2298
+ - type: accuracy
2299
+ value: 0.2590450571620713
2300
+ - type: f1
2301
+ value: 0.24547396574853445
2302
+ - task:
2303
+ type: Classification
2304
+ dataset:
2305
+ type: mteb/amazon_massive_scenario
2306
+ name: MTEB MassiveScenarioClassification (id)
2307
+ metrics:
2308
+ - type: accuracy
2309
+ value: 0.4095830531271016
2310
+ - type: f1
2311
+ value: 0.40177843177422223
2312
+ - task:
2313
+ type: Classification
2314
+ dataset:
2315
+ type: mteb/amazon_massive_scenario
2316
+ name: MTEB MassiveScenarioClassification (is)
2317
+ metrics:
2318
+ - type: accuracy
2319
+ value: 0.38564223268325487
2320
+ - type: f1
2321
+ value: 0.3735307758495248
2322
+ - task:
2323
+ type: Classification
2324
+ dataset:
2325
+ type: mteb/amazon_massive_scenario
2326
+ name: MTEB MassiveScenarioClassification (it)
2327
+ metrics:
2328
+ - type: accuracy
2329
+ value: 0.4658708809683928
2330
+ - type: f1
2331
+ value: 0.44103900526804984
2332
+ - task:
2333
+ type: Classification
2334
+ dataset:
2335
+ type: mteb/amazon_massive_scenario
2336
+ name: MTEB MassiveScenarioClassification (ja)
2337
+ metrics:
2338
+ - type: accuracy
2339
+ value: 0.4624747814391393
2340
+ - type: f1
2341
+ value: 0.454107101796664
2342
+ - task:
2343
+ type: Classification
2344
+ dataset:
2345
+ type: mteb/amazon_massive_scenario
2346
+ name: MTEB MassiveScenarioClassification (jv)
2347
+ metrics:
2348
+ - type: accuracy
2349
+ value: 0.396570275722932
2350
+ - type: f1
2351
+ value: 0.3882737576832412
2352
+ - task:
2353
+ type: Classification
2354
+ dataset:
2355
+ type: mteb/amazon_massive_scenario
2356
+ name: MTEB MassiveScenarioClassification (ka)
2357
+ metrics:
2358
+ - type: accuracy
2359
+ value: 0.2527908540685945
2360
+ - type: f1
2361
+ value: 0.23662661686788491
2362
+ - task:
2363
+ type: Classification
2364
+ dataset:
2365
+ type: mteb/amazon_massive_scenario
2366
+ name: MTEB MassiveScenarioClassification (km)
2367
+ metrics:
2368
+ - type: accuracy
2369
+ value: 0.2897108271687962
2370
+ - type: f1
2371
+ value: 0.27195758324189245
2372
+ - task:
2373
+ type: Classification
2374
+ dataset:
2375
+ type: mteb/amazon_massive_scenario
2376
+ name: MTEB MassiveScenarioClassification (kn)
2377
+ metrics:
2378
+ - type: accuracy
2379
+ value: 0.1927370544720915
2380
+ - type: f1
2381
+ value: 0.18694271924323635
2382
+ - task:
2383
+ type: Classification
2384
+ dataset:
2385
+ type: mteb/amazon_massive_scenario
2386
+ name: MTEB MassiveScenarioClassification (ko)
2387
+ metrics:
2388
+ - type: accuracy
2389
+ value: 0.3572965702757229
2390
+ - type: f1
2391
+ value: 0.3438287006177308
2392
+ - task:
2393
+ type: Classification
2394
+ dataset:
2395
+ type: mteb/amazon_massive_scenario
2396
+ name: MTEB MassiveScenarioClassification (lv)
2397
+ metrics:
2398
+ - type: accuracy
2399
+ value: 0.3957296570275723
2400
+ - type: f1
2401
+ value: 0.38074945140886923
2402
+ - task:
2403
+ type: Classification
2404
+ dataset:
2405
+ type: mteb/amazon_massive_scenario
2406
+ name: MTEB MassiveScenarioClassification (ml)
2407
+ metrics:
2408
+ - type: accuracy
2409
+ value: 0.19895763281775386
2410
+ - type: f1
2411
+ value: 0.20009313648468288
2412
+ - task:
2413
+ type: Classification
2414
+ dataset:
2415
+ type: mteb/amazon_massive_scenario
2416
+ name: MTEB MassiveScenarioClassification (mn)
2417
+ metrics:
2418
+ - type: accuracy
2419
+ value: 0.32431069266980495
2420
+ - type: f1
2421
+ value: 0.31395958664782575
2422
+ - task:
2423
+ type: Classification
2424
+ dataset:
2425
+ type: mteb/amazon_massive_scenario
2426
+ name: MTEB MassiveScenarioClassification (ms)
2427
+ metrics:
2428
+ - type: accuracy
2429
+ value: 0.42323470073974445
2430
+ - type: f1
2431
+ value: 0.4081374026314701
2432
+ - task:
2433
+ type: Classification
2434
+ dataset:
2435
+ type: mteb/amazon_massive_scenario
2436
+ name: MTEB MassiveScenarioClassification (my)
2437
+ metrics:
2438
+ - type: accuracy
2439
+ value: 0.20864156018829857
2440
+ - type: f1
2441
+ value: 0.20409870408935435
2442
+ - task:
2443
+ type: Classification
2444
+ dataset:
2445
+ type: mteb/amazon_massive_scenario
2446
+ name: MTEB MassiveScenarioClassification (nb)
2447
+ metrics:
2448
+ - type: accuracy
2449
+ value: 0.4047074646940148
2450
+ - type: f1
2451
+ value: 0.3919044149415904
2452
+ - task:
2453
+ type: Classification
2454
+ dataset:
2455
+ type: mteb/amazon_massive_scenario
2456
+ name: MTEB MassiveScenarioClassification (nl)
2457
+ metrics:
2458
+ - type: accuracy
2459
+ value: 0.43591123066577
2460
+ - type: f1
2461
+ value: 0.4143420363064241
2462
+ - task:
2463
+ type: Classification
2464
+ dataset:
2465
+ type: mteb/amazon_massive_scenario
2466
+ name: MTEB MassiveScenarioClassification (pl)
2467
+ metrics:
2468
+ - type: accuracy
2469
+ value: 0.41876260928043046
2470
+ - type: f1
2471
+ value: 0.4119211767666761
2472
+ - task:
2473
+ type: Classification
2474
+ dataset:
2475
+ type: mteb/amazon_massive_scenario
2476
+ name: MTEB MassiveScenarioClassification (pt)
2477
+ metrics:
2478
+ - type: accuracy
2479
+ value: 0.46308002689979827
2480
+ - type: f1
2481
+ value: 0.4525536730126799
2482
+ - task:
2483
+ type: Classification
2484
+ dataset:
2485
+ type: mteb/amazon_massive_scenario
2486
+ name: MTEB MassiveScenarioClassification (ro)
2487
+ metrics:
2488
+ - type: accuracy
2489
+ value: 0.4252521856086079
2490
+ - type: f1
2491
+ value: 0.4102418109296485
2492
+ - task:
2493
+ type: Classification
2494
+ dataset:
2495
+ type: mteb/amazon_massive_scenario
2496
+ name: MTEB MassiveScenarioClassification (ru)
2497
+ metrics:
2498
+ - type: accuracy
2499
+ value: 0.3594821788836584
2500
+ - type: f1
2501
+ value: 0.3508598314806566
2502
+ - task:
2503
+ type: Classification
2504
+ dataset:
2505
+ type: mteb/amazon_massive_scenario
2506
+ name: MTEB MassiveScenarioClassification (sl)
2507
+ metrics:
2508
+ - type: accuracy
2509
+ value: 0.3869199731002017
2510
+ - type: f1
2511
+ value: 0.3768119408674127
2512
+ - task:
2513
+ type: Classification
2514
+ dataset:
2515
+ type: mteb/amazon_massive_scenario
2516
+ name: MTEB MassiveScenarioClassification (sq)
2517
+ metrics:
2518
+ - type: accuracy
2519
+ value: 0.4047410894418292
2520
+ - type: f1
2521
+ value: 0.39480530387013596
2522
+ - task:
2523
+ type: Classification
2524
+ dataset:
2525
+ type: mteb/amazon_massive_scenario
2526
+ name: MTEB MassiveScenarioClassification (sv)
2527
+ metrics:
2528
+ - type: accuracy
2529
+ value: 0.41523201075991933
2530
+ - type: f1
2531
+ value: 0.40200979960243827
2532
+ - task:
2533
+ type: Classification
2534
+ dataset:
2535
+ type: mteb/amazon_massive_scenario
2536
+ name: MTEB MassiveScenarioClassification (sw)
2537
+ metrics:
2538
+ - type: accuracy
2539
+ value: 0.39549428379287155
2540
+ - type: f1
2541
+ value: 0.3818556124333806
2542
+ - task:
2543
+ type: Classification
2544
+ dataset:
2545
+ type: mteb/amazon_massive_scenario
2546
+ name: MTEB MassiveScenarioClassification (ta)
2547
+ metrics:
2548
+ - type: accuracy
2549
+ value: 0.228782784129119
2550
+ - type: f1
2551
+ value: 0.22239467186721457
2552
+ - task:
2553
+ type: Classification
2554
+ dataset:
2555
+ type: mteb/amazon_massive_scenario
2556
+ name: MTEB MassiveScenarioClassification (te)
2557
+ metrics:
2558
+ - type: accuracy
2559
+ value: 0.2051445864156019
2560
+ - type: f1
2561
+ value: 0.1999904788553022
2562
+ - task:
2563
+ type: Classification
2564
+ dataset:
2565
+ type: mteb/amazon_massive_scenario
2566
+ name: MTEB MassiveScenarioClassification (th)
2567
+ metrics:
2568
+ - type: accuracy
2569
+ value: 0.34926025554808343
2570
+ - type: f1
2571
+ value: 0.33240167172157226
2572
+ - task:
2573
+ type: Classification
2574
+ dataset:
2575
+ type: mteb/amazon_massive_scenario
2576
+ name: MTEB MassiveScenarioClassification (tl)
2577
+ metrics:
2578
+ - type: accuracy
2579
+ value: 0.4074983187626093
2580
+ - type: f1
2581
+ value: 0.3930274328728882
2582
+ - task:
2583
+ type: Classification
2584
+ dataset:
2585
+ type: mteb/amazon_massive_scenario
2586
+ name: MTEB MassiveScenarioClassification (tr)
2587
+ metrics:
2588
+ - type: accuracy
2589
+ value: 0.3906859448554136
2590
+ - type: f1
2591
+ value: 0.39215420396629713
2592
+ - task:
2593
+ type: Classification
2594
+ dataset:
2595
+ type: mteb/amazon_massive_scenario
2596
+ name: MTEB MassiveScenarioClassification (ur)
2597
+ metrics:
2598
+ - type: accuracy
2599
+ value: 0.29747814391392063
2600
+ - type: f1
2601
+ value: 0.2826183689222045
2602
+ - task:
2603
+ type: Classification
2604
+ dataset:
2605
+ type: mteb/amazon_massive_scenario
2606
+ name: MTEB MassiveScenarioClassification (vi)
2607
+ metrics:
2608
+ - type: accuracy
2609
+ value: 0.3802286482851379
2610
+ - type: f1
2611
+ value: 0.37874243860869694
2612
+ - task:
2613
+ type: Classification
2614
+ dataset:
2615
+ type: mteb/amazon_massive_scenario
2616
+ name: MTEB MassiveScenarioClassification (zh-CN)
2617
+ metrics:
2618
+ - type: accuracy
2619
+ value: 0.48550773369199723
2620
+ - type: f1
2621
+ value: 0.46739962588264905
2622
+ - task:
2623
+ type: Classification
2624
+ dataset:
2625
+ type: mteb/amazon_massive_scenario
2626
+ name: MTEB MassiveScenarioClassification (zh-TW)
2627
+ metrics:
2628
+ - type: accuracy
2629
+ value: 0.45178211163416276
2630
+ - type: f1
2631
+ value: 0.4484809741811729
2632
+ - task:
2633
+ type: Retrieval
2634
+ dataset:
2635
+ type: quora
2636
+ name: MTEB QuoraRetrieval
2637
+ metrics:
2638
+ - type: map_at_1
2639
+ value: 0.61697
2640
+ - type: map_at_10
2641
+ value: 0.74204
2642
+ - type: map_at_100
2643
+ value: 0.75023
2644
+ - type: map_at_1000
2645
+ value: 0.75059
2646
+ - type: map_at_3
2647
+ value: 0.71265
2648
+ - type: map_at_5
2649
+ value: 0.73001
2650
+ - type: ndcg_at_1
2651
+ value: 0.7095
2652
+ - type: ndcg_at_10
2653
+ value: 0.7896
2654
+ - type: ndcg_at_100
2655
+ value: 0.8126
2656
+ - type: ndcg_at_1000
2657
+ value: 0.81679
2658
+ - type: ndcg_at_3
2659
+ value: 0.75246
2660
+ - type: ndcg_at_5
2661
+ value: 0.77092
2662
+ - type: precision_at_1
2663
+ value: 0.7095
2664
+ - type: precision_at_10
2665
+ value: 0.11998
2666
+ - type: precision_at_100
2667
+ value: 0.01451
2668
+ - type: precision_at_1000
2669
+ value: 0.00154
2670
+ - type: precision_at_3
2671
+ value: 0.3263
2672
+ - type: precision_at_5
2673
+ value: 0.21574
2674
+ - type: recall_at_1
2675
+ value: 0.61697
2676
+ - type: recall_at_10
2677
+ value: 0.88233
2678
+ - type: recall_at_100
2679
+ value: 0.96961
2680
+ - type: recall_at_1000
2681
+ value: 0.99401
2682
+ - type: recall_at_3
2683
+ value: 0.77689
2684
+ - type: recall_at_5
2685
+ value: 0.82745
2686
+ - task:
2687
+ type: STS
2688
+ dataset:
2689
+ type: mteb/sickr-sts
2690
+ name: MTEB SICK-R
2691
+ metrics:
2692
+ - type: cos_sim_pearson
2693
+ value: 0.8096286245858941
2694
+ - type: cos_sim_spearman
2695
+ value: 0.7457093488947429
2696
+ - type: euclidean_pearson
2697
+ value: 0.7550377970259401
2698
+ - type: euclidean_spearman
2699
+ value: 0.7174980046229991
2700
+ - type: manhattan_pearson
2701
+ value: 0.7532568360913819
2702
+ - type: manhattan_spearman
2703
+ value: 0.7180676733410375
2704
+ - task:
2705
+ type: PairClassification
2706
+ dataset:
2707
+ type: mteb/twitterurlcorpus-pairclassification
2708
+ name: MTEB TwitterURLCorpus
2709
+ metrics:
2710
+ - type: cos_sim_accuracy
2711
+ value: 0.8663018589668956
2712
+ - type: cos_sim_accuracy_threshold
2713
+ value: 0.6738145351409912
2714
+ - type: cos_sim_ap
2715
+ value: 0.805106377126291
2716
+ - type: cos_sim_f1
2717
+ value: 0.7270810586950793
2718
+ - type: cos_sim_f1_threshold
2719
+ value: 0.6406128406524658
2720
+ - type: cos_sim_precision
2721
+ value: 0.7114123627790466
2722
+ - type: cos_sim_recall
2723
+ value: 0.743455497382199
2724
+ - type: dot_accuracy
2725
+ value: 0.8241743315092949
2726
+ - type: dot_accuracy_threshold
2727
+ value: 967.1823120117188
2728
+ - type: dot_ap
2729
+ value: 0.692393381283664
2730
+ - type: dot_f1
2731
+ value: 0.6561346624814597
2732
+ - type: dot_f1_threshold
2733
+ value: 831.1060791015625
2734
+ - type: dot_precision
2735
+ value: 0.5943260638630257
2736
+ - type: dot_recall
2737
+ value: 0.7322913458577148
2738
+ - type: euclidean_accuracy
2739
+ value: 0.8649435324251951
2740
+ - type: euclidean_accuracy_threshold
2741
+ value: 30.077878952026367
2742
+ - type: euclidean_ap
2743
+ value: 0.8028100477250927
2744
+ - type: euclidean_f1
2745
+ value: 0.7258242344489099
2746
+ - type: euclidean_f1_threshold
2747
+ value: 32.570228576660156
2748
+ - type: euclidean_precision
2749
+ value: 0.6744662568576906
2750
+ - type: euclidean_recall
2751
+ value: 0.7856482907299045
2752
+ - type: manhattan_accuracy
2753
+ value: 0.8659525749990298
2754
+ - type: manhattan_accuracy_threshold
2755
+ value: 625.0921020507812
2756
+ - type: manhattan_ap
2757
+ value: 0.8037850832566262
2758
+ - type: manhattan_f1
2759
+ value: 0.7259435321233073
2760
+ - type: manhattan_f1_threshold
2761
+ value: 679.8679809570312
2762
+ - type: manhattan_precision
2763
+ value: 0.6819350473612991
2764
+ - type: manhattan_recall
2765
+ value: 0.7760240221743148
2766
+ - type: max_accuracy
2767
+ value: 0.8663018589668956
2768
+ - type: max_ap
2769
+ value: 0.805106377126291
2770
+ - type: max_f1
2771
+ value: 0.7270810586950793
2772
+ - task:
2773
+ type: Clustering
2774
+ dataset:
2775
+ type: mteb/biorxiv-clustering-s2s
2776
+ name: MTEB BiorxivClusteringS2S
2777
+ metrics:
2778
+ - type: v_measure
2779
+ value: 0.23080939123955474
2780
+ - task:
2781
+ type: STS
2782
+ dataset:
2783
+ type: mteb/sts17-crosslingual-sts
2784
+ name: MTEB STS17 (ko-ko)
2785
+ metrics:
2786
+ - type: cos_sim_pearson
2787
+ value: 0.430464619152799
2788
+ - type: cos_sim_spearman
2789
+ value: 0.4565606588928089
2790
+ - type: euclidean_pearson
2791
+ value: 0.45694377883554993
2792
+ - type: euclidean_spearman
2793
+ value: 0.4508552742346606
2794
+ - type: manhattan_pearson
2795
+ value: 0.45871666989036813
2796
+ - type: manhattan_spearman
2797
+ value: 0.45155963016434164
2798
+ - task:
2799
+ type: STS
2800
+ dataset:
2801
+ type: mteb/sts17-crosslingual-sts
2802
+ name: MTEB STS17 (ar-ar)
2803
+ metrics:
2804
+ - type: cos_sim_pearson
2805
+ value: 0.5327469278912148
2806
+ - type: cos_sim_spearman
2807
+ value: 0.541611320762379
2808
+ - type: euclidean_pearson
2809
+ value: 0.5597026429327157
2810
+ - type: euclidean_spearman
2811
+ value: 0.5471320909074608
2812
+ - type: manhattan_pearson
2813
+ value: 0.5612511774278802
2814
+ - type: manhattan_spearman
2815
+ value: 0.5522875659158676
2816
+ - task:
2817
+ type: STS
2818
+ dataset:
2819
+ type: mteb/sts17-crosslingual-sts
2820
+ name: MTEB STS17 (en-ar)
2821
+ metrics:
2822
+ - type: cos_sim_pearson
2823
+ value: 0.015482997790039945
2824
+ - type: cos_sim_spearman
2825
+ value: 0.01720838634736358
2826
+ - type: euclidean_pearson
2827
+ value: -0.06727915670345885
2828
+ - type: euclidean_spearman
2829
+ value: -0.06112826908474543
2830
+ - type: manhattan_pearson
2831
+ value: -0.0494386093060865
2832
+ - type: manhattan_spearman
2833
+ value: -0.05018174110623732
2834
+ - task:
2835
+ type: STS
2836
+ dataset:
2837
+ type: mteb/sts17-crosslingual-sts
2838
+ name: MTEB STS17 (en-de)
2839
+ metrics:
2840
+ - type: cos_sim_pearson
2841
+ value: 0.275420218362265
2842
+ - type: cos_sim_spearman
2843
+ value: 0.2548383843103101
2844
+ - type: euclidean_pearson
2845
+ value: 0.06268684143856358
2846
+ - type: euclidean_spearman
2847
+ value: 0.058779614210916785
2848
+ - type: manhattan_pearson
2849
+ value: 0.026672377392278606
2850
+ - type: manhattan_spearman
2851
+ value: 0.025683839956554773
2852
+ - task:
2853
+ type: STS
2854
+ dataset:
2855
+ type: mteb/sts17-crosslingual-sts
2856
+ name: MTEB STS17 (en-en)
2857
+ metrics:
2858
+ - type: cos_sim_pearson
2859
+ value: 0.8532029757646663
2860
+ - type: cos_sim_spearman
2861
+ value: 0.8732720847297224
2862
+ - type: euclidean_pearson
2863
+ value: 0.8112594485791255
2864
+ - type: euclidean_spearman
2865
+ value: 0.811531079489332
2866
+ - type: manhattan_pearson
2867
+ value: 0.8132899414704019
2868
+ - type: manhattan_spearman
2869
+ value: 0.813897040261192
2870
+ - task:
2871
+ type: STS
2872
+ dataset:
2873
+ type: mteb/sts17-crosslingual-sts
2874
+ name: MTEB STS17 (en-tr)
2875
+ metrics:
2876
+ - type: cos_sim_pearson
2877
+ value: 0.0437162299241808
2878
+ - type: cos_sim_spearman
2879
+ value: 0.020879072561774542
2880
+ - type: euclidean_pearson
2881
+ value: -0.030725243785454597
2882
+ - type: euclidean_spearman
2883
+ value: -0.05372133927948353
2884
+ - type: manhattan_pearson
2885
+ value: -0.04867795293367359
2886
+ - type: manhattan_spearman
2887
+ value: -0.07939706984001878
2888
+ - task:
2889
+ type: STS
2890
+ dataset:
2891
+ type: mteb/sts17-crosslingual-sts
2892
+ name: MTEB STS17 (es-en)
2893
+ metrics:
2894
+ - type: cos_sim_pearson
2895
+ value: 0.20306030448858603
2896
+ - type: cos_sim_spearman
2897
+ value: 0.2193220782551375
2898
+ - type: euclidean_pearson
2899
+ value: 0.03878631934602361
2900
+ - type: euclidean_spearman
2901
+ value: 0.05171796902725965
2902
+ - type: manhattan_pearson
2903
+ value: 0.0713020644036815
2904
+ - type: manhattan_spearman
2905
+ value: 0.07707315591498748
2906
+ - task:
2907
+ type: STS
2908
+ dataset:
2909
+ type: mteb/sts17-crosslingual-sts
2910
+ name: MTEB STS17 (es-es)
2911
+ metrics:
2912
+ - type: cos_sim_pearson
2913
+ value: 0.6681873207478459
2914
+ - type: cos_sim_spearman
2915
+ value: 0.6780273445636502
2916
+ - type: euclidean_pearson
2917
+ value: 0.7060654682977268
2918
+ - type: euclidean_spearman
2919
+ value: 0.694566208379486
2920
+ - type: manhattan_pearson
2921
+ value: 0.7095484618966419
2922
+ - type: manhattan_spearman
2923
+ value: 0.6978323323058773
2924
+ - task:
2925
+ type: STS
2926
+ dataset:
2927
+ type: mteb/sts17-crosslingual-sts
2928
+ name: MTEB STS17 (fr-en)
2929
+ metrics:
2930
+ - type: cos_sim_pearson
2931
+ value: 0.21366487281202604
2932
+ - type: cos_sim_spearman
2933
+ value: 0.18906275286984808
2934
+ - type: euclidean_pearson
2935
+ value: -0.023390998579461995
2936
+ - type: euclidean_spearman
2937
+ value: -0.04151213674012541
2938
+ - type: manhattan_pearson
2939
+ value: -0.02234831868844863
2940
+ - type: manhattan_spearman
2941
+ value: -0.045552913285014415
2942
+ - task:
2943
+ type: STS
2944
+ dataset:
2945
+ type: mteb/sts17-crosslingual-sts
2946
+ name: MTEB STS17 (it-en)
2947
+ metrics:
2948
+ - type: cos_sim_pearson
2949
+ value: 0.20731531772510847
2950
+ - type: cos_sim_spearman
2951
+ value: 0.163855949033176
2952
+ - type: euclidean_pearson
2953
+ value: -0.08734648741714238
2954
+ - type: euclidean_spearman
2955
+ value: -0.1075672244732182
2956
+ - type: manhattan_pearson
2957
+ value: -0.07536654126608877
2958
+ - type: manhattan_spearman
2959
+ value: -0.08330065460047295
2960
+ - task:
2961
+ type: STS
2962
+ dataset:
2963
+ type: mteb/sts17-crosslingual-sts
2964
+ name: MTEB STS17 (nl-en)
2965
+ metrics:
2966
+ - type: cos_sim_pearson
2967
+ value: 0.2661843502408425
2968
+ - type: cos_sim_spearman
2969
+ value: 0.23488974089577816
2970
+ - type: euclidean_pearson
2971
+ value: -0.031310350304707864
2972
+ - type: euclidean_spearman
2973
+ value: -0.031242598481634666
2974
+ - type: manhattan_pearson
2975
+ value: -0.011096752982707007
2976
+ - type: manhattan_spearman
2977
+ value: -0.014591693078765849
2978
+ - task:
2979
+ type: Retrieval
2980
+ dataset:
2981
+ type: trec-covid
2982
+ name: MTEB TRECCOVID
2983
+ metrics:
2984
+ - type: map_at_1
2985
+ value: 0.00113
2986
+ - type: map_at_10
2987
+ value: 0.00733
2988
+ - type: map_at_100
2989
+ value: 0.03313
2990
+ - type: map_at_1000
2991
+ value: 0.07355
2992
+ - type: map_at_3
2993
+ value: 0.00282
2994
+ - type: map_at_5
2995
+ value: 0.00414
2996
+ - type: ndcg_at_1
2997
+ value: 0.42
2998
+ - type: ndcg_at_10
2999
+ value: 0.3931
3000
+ - type: ndcg_at_100
3001
+ value: 0.26904
3002
+ - type: ndcg_at_1000
3003
+ value: 0.23778
3004
+ - type: ndcg_at_3
3005
+ value: 0.42776
3006
+ - type: ndcg_at_5
3007
+ value: 0.41554
3008
+ - type: precision_at_1
3009
+ value: 0.48
3010
+ - type: precision_at_10
3011
+ value: 0.43
3012
+ - type: precision_at_100
3013
+ value: 0.2708
3014
+ - type: precision_at_1000
3015
+ value: 0.11014
3016
+ - type: precision_at_3
3017
+ value: 0.48
3018
+ - type: precision_at_5
3019
+ value: 0.456
3020
+ - type: recall_at_1
3021
+ value: 0.00113
3022
+ - type: recall_at_10
3023
+ value: 0.00976
3024
+ - type: recall_at_100
3025
+ value: 0.05888
3026
+ - type: recall_at_1000
3027
+ value: 0.22635
3028
+ - type: recall_at_3
3029
+ value: 0.00329
3030
+ - type: recall_at_5
3031
+ value: 0.00518
3032
+ - task:
3033
+ type: Retrieval
3034
+ dataset:
3035
+ type: scifact
3036
+ name: MTEB SciFact
3037
+ metrics:
3038
+ - type: map_at_1
3039
+ value: 0.21556
3040
+ - type: map_at_10
3041
+ value: 0.27982
3042
+ - type: map_at_100
3043
+ value: 0.28937
3044
+ - type: map_at_1000
3045
+ value: 0.29058
3046
+ - type: map_at_3
3047
+ value: 0.25644
3048
+ - type: map_at_5
3049
+ value: 0.26996
3050
+ - type: ndcg_at_1
3051
+ value: 0.23333
3052
+ - type: ndcg_at_10
3053
+ value: 0.31787
3054
+ - type: ndcg_at_100
3055
+ value: 0.36648
3056
+ - type: ndcg_at_1000
3057
+ value: 0.39936
3058
+ - type: ndcg_at_3
3059
+ value: 0.27299
3060
+ - type: ndcg_at_5
3061
+ value: 0.29659
3062
+ - type: precision_at_1
3063
+ value: 0.23333
3064
+ - type: precision_at_10
3065
+ value: 0.04867
3066
+ - type: precision_at_100
3067
+ value: 0.00743
3068
+ - type: precision_at_1000
3069
+ value: 0.00102
3070
+ - type: precision_at_3
3071
+ value: 0.11333
3072
+ - type: precision_at_5
3073
+ value: 0.08133
3074
+ - type: recall_at_1
3075
+ value: 0.21556
3076
+ - type: recall_at_10
3077
+ value: 0.42333
3078
+ - type: recall_at_100
3079
+ value: 0.65706
3080
+ - type: recall_at_1000
3081
+ value: 0.91489
3082
+ - type: recall_at_3
3083
+ value: 0.30361
3084
+ - type: recall_at_5
3085
+ value: 0.36222
3086
+ - task:
3087
+ type: Retrieval
3088
+ dataset:
3089
+ type: scidocs
3090
+ name: MTEB SCIDOCS
3091
+ metrics:
3092
+ - type: map_at_1
3093
+ value: 0.0172
3094
+ - type: map_at_10
3095
+ value: 0.03824
3096
+ - type: map_at_100
3097
+ value: 0.04727
3098
+ - type: map_at_1000
3099
+ value: 0.04932
3100
+ - type: map_at_3
3101
+ value: 0.02867
3102
+ - type: map_at_5
3103
+ value: 0.03323
3104
+ - type: ndcg_at_1
3105
+ value: 0.085
3106
+ - type: ndcg_at_10
3107
+ value: 0.07133
3108
+ - type: ndcg_at_100
3109
+ value: 0.11911
3110
+ - type: ndcg_at_1000
3111
+ value: 0.16962
3112
+ - type: ndcg_at_3
3113
+ value: 0.06763
3114
+ - type: ndcg_at_5
3115
+ value: 0.05832
3116
+ - type: precision_at_1
3117
+ value: 0.085
3118
+ - type: precision_at_10
3119
+ value: 0.0368
3120
+ - type: precision_at_100
3121
+ value: 0.01067
3122
+ - type: precision_at_1000
3123
+ value: 0.0023
3124
+ - type: precision_at_3
3125
+ value: 0.06233
3126
+ - type: precision_at_5
3127
+ value: 0.0502
3128
+ - type: recall_at_1
3129
+ value: 0.0172
3130
+ - type: recall_at_10
3131
+ value: 0.07487
3132
+ - type: recall_at_100
3133
+ value: 0.21683
3134
+ - type: recall_at_1000
3135
+ value: 0.46688
3136
+ - type: recall_at_3
3137
+ value: 0.03798
3138
+ - type: recall_at_5
3139
+ value: 0.05113
3140
+ - task:
3141
+ type: Retrieval
3142
+ dataset:
3143
+ type: nq
3144
+ name: MTEB NQ
3145
+ metrics:
3146
+ - type: map_at_1
3147
+ value: 0.03515
3148
+ - type: map_at_10
3149
+ value: 0.05884
3150
+ - type: map_at_100
3151
+ value: 0.0651
3152
+ - type: map_at_1000
3153
+ value: 0.06599
3154
+ - type: map_at_3
3155
+ value: 0.04892
3156
+ - type: map_at_5
3157
+ value: 0.05391
3158
+ - type: ndcg_at_1
3159
+ value: 0.04056
3160
+ - type: ndcg_at_10
3161
+ value: 0.07626
3162
+ - type: ndcg_at_100
3163
+ value: 0.1108
3164
+ - type: ndcg_at_1000
3165
+ value: 0.13793
3166
+ - type: ndcg_at_3
3167
+ value: 0.05537
3168
+ - type: ndcg_at_5
3169
+ value: 0.0645
3170
+ - type: precision_at_1
3171
+ value: 0.04056
3172
+ - type: precision_at_10
3173
+ value: 0.01457
3174
+ - type: precision_at_100
3175
+ value: 0.00347
3176
+ - type: precision_at_1000
3177
+ value: 0.00061
3178
+ - type: precision_at_3
3179
+ value: 0.02607
3180
+ - type: precision_at_5
3181
+ value: 0.02086
3182
+ - type: recall_at_1
3183
+ value: 0.03515
3184
+ - type: recall_at_10
3185
+ value: 0.12312
3186
+ - type: recall_at_100
3187
+ value: 0.28713
3188
+ - type: recall_at_1000
3189
+ value: 0.50027
3190
+ - type: recall_at_3
3191
+ value: 0.06701
3192
+ - type: recall_at_5
3193
+ value: 0.08816
3194
+ - task:
3195
+ type: STS
3196
+ dataset:
3197
+ type: mteb/sts16-sts
3198
+ name: MTEB STS16
3199
+ metrics:
3200
+ - type: cos_sim_pearson
3201
+ value: 0.7604750373932828
3202
+ - type: cos_sim_spearman
3203
+ value: 0.7793230986462234
3204
+ - type: euclidean_pearson
3205
+ value: 0.758320302521164
3206
+ - type: euclidean_spearman
3207
+ value: 0.7683154481579385
3208
+ - type: manhattan_pearson
3209
+ value: 0.7598713517720608
3210
+ - type: manhattan_spearman
3211
+ value: 0.7695479705521506
3212
+ - task:
3213
+ type: Classification
3214
+ dataset:
3215
+ type: mteb/emotion
3216
+ name: MTEB EmotionClassification
3217
+ metrics:
3218
+ - type: accuracy
3219
+ value: 0.42225
3220
+ - type: f1
3221
+ value: 0.3756351654211211
3222
+ - task:
3223
+ type: Retrieval
3224
+ dataset:
3225
+ type: BeIR/cqadupstack
3226
+ name: MTEB CQADupstackWebmastersRetrieval
3227
+ metrics:
3228
+ - type: map_at_1
3229
+ value: 0.13757
3230
+ - type: map_at_10
3231
+ value: 0.1927
3232
+ - type: map_at_100
3233
+ value: 0.20461
3234
+ - type: map_at_1000
3235
+ value: 0.20641
3236
+ - type: map_at_3
3237
+ value: 0.17865
3238
+ - type: map_at_5
3239
+ value: 0.18618
3240
+ - type: ndcg_at_1
3241
+ value: 0.16996
3242
+ - type: ndcg_at_10
3243
+ value: 0.22774
3244
+ - type: ndcg_at_100
3245
+ value: 0.27675
3246
+ - type: ndcg_at_1000
3247
+ value: 0.31145
3248
+ - type: ndcg_at_3
3249
+ value: 0.20691
3250
+ - type: ndcg_at_5
3251
+ value: 0.21741
3252
+ - type: precision_at_1
3253
+ value: 0.16996
3254
+ - type: precision_at_10
3255
+ value: 0.04545
3256
+ - type: precision_at_100
3257
+ value: 0.01036
3258
+ - type: precision_at_1000
3259
+ value: 0.00185
3260
+ - type: precision_at_3
3261
+ value: 0.10145
3262
+ - type: precision_at_5
3263
+ value: 0.07391
3264
+ - type: recall_at_1
3265
+ value: 0.13757
3266
+ - type: recall_at_10
3267
+ value: 0.28234
3268
+ - type: recall_at_100
3269
+ value: 0.51055
3270
+ - type: recall_at_1000
3271
+ value: 0.75353
3272
+ - type: recall_at_3
3273
+ value: 0.21794
3274
+ - type: recall_at_5
3275
+ value: 0.24614
3276
+ - task:
3277
+ type: Clustering
3278
+ dataset:
3279
+ type: mteb/reddit-clustering-p2p
3280
+ name: MTEB RedditClusteringP2P
3281
+ metrics:
3282
+ - type: v_measure
3283
+ value: 0.41007999100992665
3284
+ - task:
3285
+ type: Retrieval
3286
+ dataset:
3287
+ type: BeIR/cqadupstack
3288
+ name: MTEB CQADupstackGisRetrieval
3289
+ metrics:
3290
+ - type: map_at_1
3291
+ value: 0.11351
3292
+ - type: map_at_10
3293
+ value: 0.14953
3294
+ - type: map_at_100
3295
+ value: 0.15623
3296
+ - type: map_at_1000
3297
+ value: 0.15716
3298
+ - type: map_at_3
3299
+ value: 0.13603
3300
+ - type: map_at_5
3301
+ value: 0.14343
3302
+ - type: ndcg_at_1
3303
+ value: 0.12429
3304
+ - type: ndcg_at_10
3305
+ value: 0.17319
3306
+ - type: ndcg_at_100
3307
+ value: 0.2099
3308
+ - type: ndcg_at_1000
3309
+ value: 0.23899
3310
+ - type: ndcg_at_3
3311
+ value: 0.14605
3312
+ - type: ndcg_at_5
3313
+ value: 0.1589
3314
+ - type: precision_at_1
3315
+ value: 0.12429
3316
+ - type: precision_at_10
3317
+ value: 0.02701
3318
+ - type: precision_at_100
3319
+ value: 0.00487
3320
+ - type: precision_at_1000
3321
+ value: 0.00078
3322
+ - type: precision_at_3
3323
+ value: 0.06026
3324
+ - type: precision_at_5
3325
+ value: 0.04384
3326
+ - type: recall_at_1
3327
+ value: 0.11351
3328
+ - type: recall_at_10
3329
+ value: 0.23536
3330
+ - type: recall_at_100
3331
+ value: 0.40942
3332
+ - type: recall_at_1000
3333
+ value: 0.6405
3334
+ - type: recall_at_3
3335
+ value: 0.16195
3336
+ - type: recall_at_5
3337
+ value: 0.19264
3338
+ - task:
3339
+ type: STS
3340
+ dataset:
3341
+ type: mteb/stsbenchmark-sts
3342
+ name: MTEB STSBenchmark
3343
+ metrics:
3344
+ - type: cos_sim_pearson
3345
+ value: 0.8000905671833967
3346
+ - type: cos_sim_spearman
3347
+ value: 0.7954269211027273
3348
+ - type: euclidean_pearson
3349
+ value: 0.7951954544247442
3350
+ - type: euclidean_spearman
3351
+ value: 0.7893670303434288
3352
+ - type: manhattan_pearson
3353
+ value: 0.7947610653340678
3354
+ - type: manhattan_spearman
3355
+ value: 0.7907344156719612
3356
+ - task:
3357
+ type: Classification
3358
+ dataset:
3359
+ type: mteb/banking77
3360
+ name: MTEB Banking77Classification
3361
+ metrics:
3362
+ - type: accuracy
3363
+ value: 0.7467857142857142
3364
+ - type: f1
3365
+ value: 0.7461743413995573
3366
+ - task:
3367
+ type: Retrieval
3368
+ dataset:
3369
+ type: BeIR/cqadupstack
3370
+ name: MTEB CQADupstackStatsRetrieval
3371
+ metrics:
3372
+ - type: map_at_1
3373
+ value: 0.12307
3374
+ - type: map_at_10
3375
+ value: 0.1544
3376
+ - type: map_at_100
3377
+ value: 0.16033
3378
+ - type: map_at_1000
3379
+ value: 0.1614
3380
+ - type: map_at_3
3381
+ value: 0.14393
3382
+ - type: map_at_5
3383
+ value: 0.14856
3384
+ - type: ndcg_at_1
3385
+ value: 0.14571
3386
+ - type: ndcg_at_10
3387
+ value: 0.17685
3388
+ - type: ndcg_at_100
3389
+ value: 0.20882
3390
+ - type: ndcg_at_1000
3391
+ value: 0.23888
3392
+ - type: ndcg_at_3
3393
+ value: 0.15739
3394
+ - type: ndcg_at_5
3395
+ value: 0.16391
3396
+ - type: precision_at_1
3397
+ value: 0.14571
3398
+ - type: precision_at_10
3399
+ value: 0.02883
3400
+ - type: precision_at_100
3401
+ value: 0.00491
3402
+ - type: precision_at_1000
3403
+ value: 0.0008
3404
+ - type: precision_at_3
3405
+ value: 0.07004
3406
+ - type: precision_at_5
3407
+ value: 0.04693
3408
+ - type: recall_at_1
3409
+ value: 0.12307
3410
+ - type: recall_at_10
3411
+ value: 0.22566
3412
+ - type: recall_at_100
3413
+ value: 0.37469
3414
+ - type: recall_at_1000
3415
+ value: 0.6055
3416
+ - type: recall_at_3
3417
+ value: 0.16742
3418
+ - type: recall_at_5
3419
+ value: 0.18634
3420
+ - task:
3421
+ type: STS
3422
+ dataset:
3423
+ type: mteb/biosses-sts
3424
+ name: MTEB BIOSSES
3425
+ metrics:
3426
+ - type: cos_sim_pearson
3427
+ value: 0.7278000135012542
3428
+ - type: cos_sim_spearman
3429
+ value: 0.7092812216947605
3430
+ - type: euclidean_pearson
3431
+ value: 0.771169214949292
3432
+ - type: euclidean_spearman
3433
+ value: 0.7710175681583312
3434
+ - type: manhattan_pearson
3435
+ value: 0.7684527031837596
3436
+ - type: manhattan_spearman
3437
+ value: 0.7707043080084379
3438
+ - task:
3439
+ type: Clustering
3440
+ dataset:
3441
+ type: mteb/biorxiv-clustering-p2p
3442
+ name: MTEB BiorxivClusteringP2P
3443
+ metrics:
3444
+ - type: v_measure
3445
+ value: 0.2893427045246491
3446
+ - task:
3447
+ type: Clustering
3448
+ dataset:
3449
+ type: mteb/stackexchange-clustering-p2p
3450
+ name: MTEB StackExchangeClusteringP2P
3451
+ metrics:
3452
+ - type: v_measure
3453
+ value: 0.28230204578753637
3454
+ - task:
3455
+ type: Classification
3456
+ dataset:
3457
+ type: mteb/toxic_conversations_50k
3458
+ name: MTEB ToxicConversationsClassification
3459
+ metrics:
3460
+ - type: accuracy
3461
+ value: 0.627862
3462
+ - type: ap
3463
+ value: 0.10958454618347832
3464
+ - type: f1
3465
+ value: 0.48372434170467626
3466
+ - task:
3467
+ type: Clustering
3468
+ dataset:
3469
+ type: mteb/twentynewsgroups-clustering
3470
+ name: MTEB TwentyNewsgroupsClustering
3471
+ metrics:
3472
+ - type: v_measure
3473
+ value: 0.2824295128553035
3474
+ - task:
3475
+ type: PairClassification
3476
+ dataset:
3477
+ type: mteb/twittersemeval2015-pairclassification
3478
+ name: MTEB TwitterSemEval2015
3479
+ metrics:
3480
+ - type: cos_sim_accuracy
3481
+ value: 0.815640460153782
3482
+ - type: cos_sim_accuracy_threshold
3483
+ value: 0.7118978500366211
3484
+ - type: cos_sim_ap
3485
+ value: 0.5709409536692154
3486
+ - type: cos_sim_f1
3487
+ value: 0.5529607083563918
3488
+ - type: cos_sim_f1_threshold
3489
+ value: 0.5981647968292236
3490
+ - type: cos_sim_precision
3491
+ value: 0.47626310772163966
3492
+ - type: cos_sim_recall
3493
+ value: 0.6591029023746702
3494
+ - type: dot_accuracy
3495
+ value: 0.788162365142755
3496
+ - type: dot_accuracy_threshold
3497
+ value: 1049.799072265625
3498
+ - type: dot_ap
3499
+ value: 0.4742989400382077
3500
+ - type: dot_f1
3501
+ value: 0.5125944584382871
3502
+ - type: dot_f1_threshold
3503
+ value: 723.3736572265625
3504
+ - type: dot_precision
3505
+ value: 0.4255838271174625
3506
+ - type: dot_recall
3507
+ value: 0.6443271767810026
3508
+ - type: euclidean_accuracy
3509
+ value: 0.8029445073612684
3510
+ - type: euclidean_accuracy_threshold
3511
+ value: 26.134265899658203
3512
+ - type: euclidean_ap
3513
+ value: 0.5342012231336148
3514
+ - type: euclidean_f1
3515
+ value: 0.5186778356350464
3516
+ - type: euclidean_f1_threshold
3517
+ value: 31.25627326965332
3518
+ - type: euclidean_precision
3519
+ value: 0.454203013481364
3520
+ - type: euclidean_recall
3521
+ value: 0.604485488126649
3522
+ - type: manhattan_accuracy
3523
+ value: 0.802884901949097
3524
+ - type: manhattan_accuracy_threshold
3525
+ value: 560.0760498046875
3526
+ - type: manhattan_ap
3527
+ value: 0.5343205271323233
3528
+ - type: manhattan_f1
3529
+ value: 0.520141655599823
3530
+ - type: manhattan_f1_threshold
3531
+ value: 658.3975830078125
3532
+ - type: manhattan_precision
3533
+ value: 0.44796035074342355
3534
+ - type: manhattan_recall
3535
+ value: 0.6200527704485488
3536
+ - type: max_accuracy
3537
+ value: 0.815640460153782
3538
+ - type: max_ap
3539
+ value: 0.5709409536692154
3540
+ - type: max_f1
3541
+ value: 0.5529607083563918
3542
+ - task:
3543
+ type: Classification
3544
+ dataset:
3545
+ type: mteb/mtop_intent
3546
+ name: MTEB MTOPIntentClassification (en)
3547
+ metrics:
3548
+ - type: accuracy
3549
+ value: 0.582421340629275
3550
+ - type: f1
3551
+ value: 0.40116960466226426
3552
+ - task:
3553
+ type: Classification
3554
+ dataset:
3555
+ type: mteb/mtop_intent
3556
+ name: MTEB MTOPIntentClassification (de)
3557
+ metrics:
3558
+ - type: accuracy
3559
+ value: 0.4506903353057199
3560
+ - type: f1
3561
+ value: 0.30468468273374966
3562
+ - task:
3563
+ type: Classification
3564
+ dataset:
3565
+ type: mteb/mtop_intent
3566
+ name: MTEB MTOPIntentClassification (es)
3567
+ metrics:
3568
+ - type: accuracy
3569
+ value: 0.4880920613742495
3570
+ - type: f1
3571
+ value: 0.3265985375400447
3572
+ - task:
3573
+ type: Classification
3574
+ dataset:
3575
+ type: mteb/mtop_intent
3576
+ name: MTEB MTOPIntentClassification (fr)
3577
+ metrics:
3578
+ - type: accuracy
3579
+ value: 0.4433761352959599
3580
+ - type: f1
3581
+ value: 0.2930204743560644
3582
+ - task:
3583
+ type: Classification
3584
+ dataset:
3585
+ type: mteb/mtop_intent
3586
+ name: MTEB MTOPIntentClassification (hi)
3587
+ metrics:
3588
+ - type: accuracy
3589
+ value: 0.34198637504481894
3590
+ - type: f1
3591
+ value: 0.2206370603224841
3592
+ - task:
3593
+ type: Classification
3594
+ dataset:
3595
+ type: mteb/mtop_intent
3596
+ name: MTEB MTOPIntentClassification (th)
3597
+ metrics:
3598
+ - type: accuracy
3599
+ value: 0.4311030741410488
3600
+ - type: f1
3601
+ value: 0.2692408933648504
3602
+ - task:
3603
+ type: Clustering
3604
+ dataset:
3605
+ type: mteb/reddit-clustering
3606
+ name: MTEB RedditClustering
3607
+ metrics:
3608
+ - type: v_measure
3609
+ value: 0.3375741018380938
3610
+ - task:
3611
+ type: Retrieval
3612
+ dataset:
3613
+ type: BeIR/cqadupstack
3614
+ name: MTEB CQADupstackPhysicsRetrieval
3615
+ metrics:
3616
+ - type: map_at_1
3617
+ value: 0.13909
3618
+ - type: map_at_10
3619
+ value: 0.19256
3620
+ - type: map_at_100
3621
+ value: 0.20286
3622
+ - type: map_at_1000
3623
+ value: 0.20429
3624
+ - type: map_at_3
3625
+ value: 0.17399
3626
+ - type: map_at_5
3627
+ value: 0.18399
3628
+ - type: ndcg_at_1
3629
+ value: 0.17421
3630
+ - type: ndcg_at_10
3631
+ value: 0.23106
3632
+ - type: ndcg_at_100
3633
+ value: 0.28129
3634
+ - type: ndcg_at_1000
3635
+ value: 0.31481
3636
+ - type: ndcg_at_3
3637
+ value: 0.19789
3638
+ - type: ndcg_at_5
3639
+ value: 0.21237
3640
+ - type: precision_at_1
3641
+ value: 0.17421
3642
+ - type: precision_at_10
3643
+ value: 0.04331
3644
+ - type: precision_at_100
3645
+ value: 0.00839
3646
+ - type: precision_at_1000
3647
+ value: 0.00131
3648
+ - type: precision_at_3
3649
+ value: 0.094
3650
+ - type: precision_at_5
3651
+ value: 0.06776
3652
+ - type: recall_at_1
3653
+ value: 0.13909
3654
+ - type: recall_at_10
3655
+ value: 0.31087
3656
+ - type: recall_at_100
3657
+ value: 0.52946
3658
+ - type: recall_at_1000
3659
+ value: 0.76546
3660
+ - type: recall_at_3
3661
+ value: 0.21351
3662
+ - type: recall_at_5
3663
+ value: 0.25265
3664
+ - task:
3665
+ type: Reranking
3666
+ dataset:
3667
+ type: mteb/stackoverflowdupquestions-reranking
3668
+ name: MTEB StackOverflowDupQuestions
3669
+ metrics:
3670
+ - type: map
3671
+ value: 0.3996520488022785
3672
+ - type: mrr
3673
+ value: 0.40189248047703935
3674
+ - task:
3675
+ type: Retrieval
3676
+ dataset:
3677
+ type: BeIR/cqadupstack
3678
+ name: MTEB CQADupstackRetrieval
3679
+ metrics:
3680
+ - type: map_at_1
3681
+ value: 0.12738416666666666
3682
+ - type: map_at_10
3683
+ value: 0.17235916666666667
3684
+ - type: map_at_100
3685
+ value: 0.1806333333333333
3686
+ - type: map_at_1000
3687
+ value: 0.18184333333333333
3688
+ - type: map_at_3
3689
+ value: 0.1574775
3690
+ - type: map_at_5
3691
+ value: 0.1657825
3692
+ - type: ndcg_at_1
3693
+ value: 0.15487416666666665
3694
+ - type: ndcg_at_10
3695
+ value: 0.20290166666666667
3696
+ - type: ndcg_at_100
3697
+ value: 0.24412916666666662
3698
+ - type: ndcg_at_1000
3699
+ value: 0.27586333333333335
3700
+ - type: ndcg_at_3
3701
+ value: 0.17622083333333333
3702
+ - type: ndcg_at_5
3703
+ value: 0.18859916666666668
3704
+ - type: precision_at_1
3705
+ value: 0.15487416666666665
3706
+ - type: precision_at_10
3707
+ value: 0.036226666666666664
3708
+ - type: precision_at_100
3709
+ value: 0.006820833333333333
3710
+ - type: precision_at_1000
3711
+ value: 0.0011216666666666666
3712
+ - type: precision_at_3
3713
+ value: 0.08163749999999999
3714
+ - type: precision_at_5
3715
+ value: 0.058654166666666674
3716
+ - type: recall_at_1
3717
+ value: 0.12738416666666666
3718
+ - type: recall_at_10
3719
+ value: 0.26599416666666664
3720
+ - type: recall_at_100
3721
+ value: 0.4541258333333334
3722
+ - type: recall_at_1000
3723
+ value: 0.687565
3724
+ - type: recall_at_3
3725
+ value: 0.19008166666666668
3726
+ - type: recall_at_5
3727
+ value: 0.2224991666666667
3728
+ - task:
3729
+ type: PairClassification
3730
+ dataset:
3731
+ type: mteb/sprintduplicatequestions-pairclassification
3732
+ name: MTEB SprintDuplicateQuestions
3733
+ metrics:
3734
+ - type: cos_sim_accuracy
3735
+ value: 0.9949306930693069
3736
+ - type: cos_sim_accuracy_threshold
3737
+ value: 0.7870972752571106
3738
+ - type: cos_sim_ap
3739
+ value: 0.7773085502917281
3740
+ - type: cos_sim_f1
3741
+ value: 0.7178978681209718
3742
+ - type: cos_sim_f1_threshold
3743
+ value: 0.7572916746139526
3744
+ - type: cos_sim_precision
3745
+ value: 0.711897738446411
3746
+ - type: cos_sim_recall
3747
+ value: 0.724
3748
+ - type: dot_accuracy
3749
+ value: 0.9908118811881188
3750
+ - type: dot_accuracy_threshold
3751
+ value: 1571.5850830078125
3752
+ - type: dot_ap
3753
+ value: 0.30267748833368235
3754
+ - type: dot_f1
3755
+ value: 0.34335201222618444
3756
+ - type: dot_f1_threshold
3757
+ value: 1329.530029296875
3758
+ - type: dot_precision
3759
+ value: 0.34994807892004154
3760
+ - type: dot_recall
3761
+ value: 0.337
3762
+ - type: euclidean_accuracy
3763
+ value: 0.9951683168316832
3764
+ - type: euclidean_accuracy_threshold
3765
+ value: 25.715721130371094
3766
+ - type: euclidean_ap
3767
+ value: 0.7864498778235628
3768
+ - type: euclidean_f1
3769
+ value: 0.7309149972929074
3770
+ - type: euclidean_f1_threshold
3771
+ value: 26.336116790771484
3772
+ - type: euclidean_precision
3773
+ value: 0.7969303423848878
3774
+ - type: euclidean_recall
3775
+ value: 0.675
3776
+ - type: manhattan_accuracy
3777
+ value: 0.9953168316831683
3778
+ - type: manhattan_accuracy_threshold
3779
+ value: 534.224609375
3780
+ - type: manhattan_ap
3781
+ value: 0.7945274878693959
3782
+ - type: manhattan_f1
3783
+ value: 0.7419863373620599
3784
+ - type: manhattan_f1_threshold
3785
+ value: 562.244140625
3786
+ - type: manhattan_precision
3787
+ value: 0.7818383167220376
3788
+ - type: manhattan_recall
3789
+ value: 0.706
3790
+ - type: max_accuracy
3791
+ value: 0.9953168316831683
3792
+ - type: max_ap
3793
+ value: 0.7945274878693959
3794
+ - type: max_f1
3795
+ value: 0.7419863373620599
3796
+ - task:
3797
+ type: Retrieval
3798
+ dataset:
3799
+ type: BeIR/cqadupstack
3800
+ name: MTEB CQADupstackWordpressRetrieval
3801
+ metrics:
3802
+ - type: map_at_1
3803
+ value: 0.09057
3804
+ - type: map_at_10
3805
+ value: 0.12721
3806
+ - type: map_at_100
3807
+ value: 0.1345
3808
+ - type: map_at_1000
3809
+ value: 0.13564
3810
+ - type: map_at_3
3811
+ value: 0.1134
3812
+ - type: map_at_5
3813
+ value: 0.12245
3814
+ - type: ndcg_at_1
3815
+ value: 0.09797
3816
+ - type: ndcg_at_10
3817
+ value: 0.15091
3818
+ - type: ndcg_at_100
3819
+ value: 0.18886
3820
+ - type: ndcg_at_1000
3821
+ value: 0.2229
3822
+ - type: ndcg_at_3
3823
+ value: 0.12365
3824
+ - type: ndcg_at_5
3825
+ value: 0.13931
3826
+ - type: precision_at_1
3827
+ value: 0.09797
3828
+ - type: precision_at_10
3829
+ value: 0.02477
3830
+ - type: precision_at_100
3831
+ value: 0.00466
3832
+ - type: precision_at_1000
3833
+ value: 0.00082
3834
+ - type: precision_at_3
3835
+ value: 0.05299
3836
+ - type: precision_at_5
3837
+ value: 0.04067
3838
+ - type: recall_at_1
3839
+ value: 0.09057
3840
+ - type: recall_at_10
3841
+ value: 0.21319
3842
+ - type: recall_at_100
3843
+ value: 0.38999
3844
+ - type: recall_at_1000
3845
+ value: 0.65374
3846
+ - type: recall_at_3
3847
+ value: 0.14331
3848
+ - type: recall_at_5
3849
+ value: 0.17917
3850
  ---
3851
 
3852
  # SGPT-125M-weightedmean-nli-bitfit
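
The YAML metadata above records MTEB evaluation results for this checkpoint across STS, classification, clustering, retrieval, reranking, and pair-classification tasks. As a minimal sketch of how scores like these can be produced — assuming the `mteb` benchmark package and a `sentence-transformers`-compatible checkpoint; the repo id and task selection below are illustrative assumptions, not taken from this card:

```python
# Minimal sketch: evaluate a sentence-transformers model on a few MTEB tasks.
# Assumptions: `mteb` and `sentence-transformers` are installed, and the repo
# id below points at this checkpoint.
from mteb import MTEB
from sentence_transformers import SentenceTransformer

model_name = "Muennighoff/SGPT-125M-weightedmean-nli-bitfit"  # assumed repo id
model = SentenceTransformer(model_name)

# Run two of the tasks reported in the metadata above; omitting `tasks`
# would run the full benchmark suite instead.
evaluation = MTEB(tasks=["Banking77Classification", "SciFact"])
evaluation.run(model, output_folder=f"results/{model_name.split('/')[-1]}")
```

MTEB writes one JSON result file per task into the output folder; the per-task metrics in those files (accuracy, F1, nDCG@k, Spearman correlation, and so on) are the kind of numbers summarized in the model-index entries above.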