spacemanidol committed on
Commit
639bf59
1 Parent(s): 3b0645c

Update README.md

Files changed (1)
  1. README.md +3017 -1
README.md CHANGED
@@ -2799,4 +2799,3020 @@ model-index:
   metrics:
   - type: v_measure
  value: 79.58576208710117
- ---
+ ---
2803
+ ---
2804
+ tags:
2805
+ - mteb
2806
+ - arctic
2807
+ - arctic-embed
2808
+ model-index:
2809
+ - name: base
2810
+ results:
2811
+ - task:
2812
+ type: Classification
2813
+ dataset:
2814
+ type: mteb/amazon_counterfactual
2815
+ name: MTEB AmazonCounterfactualClassification (en)
2816
+ config: en
2817
+ split: test
2818
+ revision: e8379541af4e31359cca9fbcf4b00f2671dba205
2819
+ metrics:
2820
+ - type: accuracy
2821
+ value: 76.80597014925374
2822
+ - type: ap
2823
+ value: 39.31198155789558
2824
+ - type: f1
2825
+ value: 70.48198448222148
2826
+ - task:
2827
+ type: Classification
2828
+ dataset:
2829
+ type: mteb/amazon_polarity
2830
+ name: MTEB AmazonPolarityClassification
2831
+ config: default
2832
+ split: test
2833
+ revision: e2d317d38cd51312af73b3d32a06d1a08b442046
2834
+ metrics:
2835
+ - type: accuracy
2836
+ value: 82.831525
2837
+ - type: ap
2838
+ value: 77.4474050181638
2839
+ - type: f1
2840
+ value: 82.77204845110204
2841
+ - task:
2842
+ type: Classification
2843
+ dataset:
2844
+ type: mteb/amazon_reviews_multi
2845
+ name: MTEB AmazonReviewsClassification (en)
2846
+ config: en
2847
+ split: test
2848
+ revision: 1399c76144fd37290681b995c656ef9b2e06e26d
2849
+ metrics:
2850
+ - type: accuracy
2851
+ value: 38.93000000000001
2852
+ - type: f1
2853
+ value: 37.98013371053459
2854
+ - task:
2855
+ type: Retrieval
2856
+ dataset:
2857
+ type: mteb/arguana
2858
+ name: MTEB ArguAna
2859
+ config: default
2860
+ split: test
2861
+ revision: c22ab2a51041ffd869aaddef7af8d8215647e41a
2862
+ metrics:
2863
+ - type: map_at_1
2864
+ value: 31.223
2865
+ - type: map_at_10
2866
+ value: 47.43
2867
+ - type: map_at_100
2868
+ value: 48.208
2869
+ - type: map_at_1000
2870
+ value: 48.211
2871
+ - type: map_at_3
2872
+ value: 42.579
2873
+ - type: map_at_5
2874
+ value: 45.263999999999996
2875
+ - type: mrr_at_1
2876
+ value: 31.65
2877
+ - type: mrr_at_10
2878
+ value: 47.573
2879
+ - type: mrr_at_100
2880
+ value: 48.359
2881
+ - type: mrr_at_1000
2882
+ value: 48.362
2883
+ - type: mrr_at_3
2884
+ value: 42.734
2885
+ - type: mrr_at_5
2886
+ value: 45.415
2887
+ - type: ndcg_at_1
2888
+ value: 31.223
2889
+ - type: ndcg_at_10
2890
+ value: 56.436
2891
+ - type: ndcg_at_100
2892
+ value: 59.657000000000004
2893
+ - type: ndcg_at_1000
2894
+ value: 59.731
2895
+ - type: ndcg_at_3
2896
+ value: 46.327
2897
+ - type: ndcg_at_5
2898
+ value: 51.178000000000004
2899
+ - type: precision_at_1
2900
+ value: 31.223
2901
+ - type: precision_at_10
2902
+ value: 8.527999999999999
2903
+ - type: precision_at_100
2904
+ value: 0.991
2905
+ - type: precision_at_1000
2906
+ value: 0.1
2907
+ - type: precision_at_3
2908
+ value: 19.061
2909
+ - type: precision_at_5
2910
+ value: 13.797999999999998
2911
+ - type: recall_at_1
2912
+ value: 31.223
2913
+ - type: recall_at_10
2914
+ value: 85.277
2915
+ - type: recall_at_100
2916
+ value: 99.075
2917
+ - type: recall_at_1000
2918
+ value: 99.644
2919
+ - type: recall_at_3
2920
+ value: 57.18299999999999
2921
+ - type: recall_at_5
2922
+ value: 68.99
2923
+ - task:
2924
+ type: Clustering
2925
+ dataset:
2926
+ type: mteb/arxiv-clustering-p2p
2927
+ name: MTEB ArxivClusteringP2P
2928
+ config: default
2929
+ split: test
2930
+ revision: a122ad7f3f0291bf49cc6f4d32aa80929df69d5d
2931
+ metrics:
2932
+ - type: v_measure
2933
+ value: 47.23625429411296
2934
+ - task:
2935
+ type: Clustering
2936
+ dataset:
2937
+ type: mteb/arxiv-clustering-s2s
2938
+ name: MTEB ArxivClusteringS2S
2939
+ config: default
2940
+ split: test
2941
+ revision: f910caf1a6075f7329cdf8c1a6135696f37dbd53
2942
+ metrics:
2943
+ - type: v_measure
2944
+ value: 37.433880471403654
2945
+ - task:
2946
+ type: Reranking
2947
+ dataset:
2948
+ type: mteb/askubuntudupquestions-reranking
2949
+ name: MTEB AskUbuntuDupQuestions
2950
+ config: default
2951
+ split: test
2952
+ revision: 2000358ca161889fa9c082cb41daa8dcfb161a54
2953
+ metrics:
2954
+ - type: map
2955
+ value: 60.53175025582013
2956
+ - type: mrr
2957
+ value: 74.51160796728664
2958
+ - task:
2959
+ type: STS
2960
+ dataset:
2961
+ type: mteb/biosses-sts
2962
+ name: MTEB BIOSSES
2963
+ config: default
2964
+ split: test
2965
+ revision: d3fb88f8f02e40887cd149695127462bbcf29b4a
2966
+ metrics:
2967
+ - type: cos_sim_pearson
2968
+ value: 88.93746103286769
2969
+ - type: cos_sim_spearman
2970
+ value: 86.62245567912619
2971
+ - type: euclidean_pearson
2972
+ value: 87.154173907501
2973
+ - type: euclidean_spearman
2974
+ value: 86.62245567912619
2975
+ - type: manhattan_pearson
2976
+ value: 87.17682026633462
2977
+ - type: manhattan_spearman
2978
+ value: 86.74775973908348
2979
+ - task:
2980
+ type: Classification
2981
+ dataset:
2982
+ type: mteb/banking77
2983
+ name: MTEB Banking77Classification
2984
+ config: default
2985
+ split: test
2986
+ revision: 0fd18e25b25c072e09e0d92ab615fda904d66300
2987
+ metrics:
2988
+ - type: accuracy
2989
+ value: 80.33766233766232
2990
+ - type: f1
2991
+ value: 79.64931422442245
2992
+ - task:
2993
+ type: Clustering
2994
+ dataset:
2995
+ type: jinaai/big-patent-clustering
2996
+ name: MTEB BigPatentClustering
2997
+ config: default
2998
+ split: test
2999
+ revision: 62d5330920bca426ce9d3c76ea914f15fc83e891
3000
+ metrics:
3001
+ - type: v_measure
3002
+ value: 19.116028913890613
3003
+ - task:
3004
+ type: Clustering
3005
+ dataset:
3006
+ type: mteb/biorxiv-clustering-p2p
3007
+ name: MTEB BiorxivClusteringP2P
3008
+ config: default
3009
+ split: test
3010
+ revision: 65b79d1d13f80053f67aca9498d9402c2d9f1f40
3011
+ metrics:
3012
+ - type: v_measure
3013
+ value: 36.966921852810174
3014
+ - task:
3015
+ type: Clustering
3016
+ dataset:
3017
+ type: mteb/biorxiv-clustering-s2s
3018
+ name: MTEB BiorxivClusteringS2S
3019
+ config: default
3020
+ split: test
3021
+ revision: 258694dd0231531bc1fd9de6ceb52a0853c6d908
3022
+ metrics:
3023
+ - type: v_measure
3024
+ value: 31.98019698537654
3025
+ - task:
3026
+ type: Retrieval
3027
+ dataset:
3028
+ type: mteb/cqadupstack-android
3029
+ name: MTEB CQADupstackAndroidRetrieval
3030
+ config: default
3031
+ split: test
3032
+ revision: f46a197baaae43b4f621051089b82a364682dfeb
3033
+ metrics:
3034
+ - type: map_at_1
3035
+ value: 34.079
3036
+ - type: map_at_10
3037
+ value: 46.35
3038
+ - type: map_at_100
3039
+ value: 47.785
3040
+ - type: map_at_1000
3041
+ value: 47.903
3042
+ - type: map_at_3
3043
+ value: 42.620999999999995
3044
+ - type: map_at_5
3045
+ value: 44.765
3046
+ - type: mrr_at_1
3047
+ value: 41.345
3048
+ - type: mrr_at_10
3049
+ value: 52.032000000000004
3050
+ - type: mrr_at_100
3051
+ value: 52.690000000000005
3052
+ - type: mrr_at_1000
3053
+ value: 52.727999999999994
3054
+ - type: mrr_at_3
3055
+ value: 49.428
3056
+ - type: mrr_at_5
3057
+ value: 51.093999999999994
3058
+ - type: ndcg_at_1
3059
+ value: 41.345
3060
+ - type: ndcg_at_10
3061
+ value: 53.027
3062
+ - type: ndcg_at_100
3063
+ value: 57.962
3064
+ - type: ndcg_at_1000
3065
+ value: 59.611999999999995
3066
+ - type: ndcg_at_3
3067
+ value: 47.687000000000005
3068
+ - type: ndcg_at_5
3069
+ value: 50.367
3070
+ - type: precision_at_1
3071
+ value: 41.345
3072
+ - type: precision_at_10
3073
+ value: 10.157
3074
+ - type: precision_at_100
3075
+ value: 1.567
3076
+ - type: precision_at_1000
3077
+ value: 0.199
3078
+ - type: precision_at_3
3079
+ value: 23.081
3080
+ - type: precision_at_5
3081
+ value: 16.738
3082
+ - type: recall_at_1
3083
+ value: 34.079
3084
+ - type: recall_at_10
3085
+ value: 65.93900000000001
3086
+ - type: recall_at_100
3087
+ value: 86.42699999999999
3088
+ - type: recall_at_1000
3089
+ value: 96.61
3090
+ - type: recall_at_3
3091
+ value: 50.56699999999999
3092
+ - type: recall_at_5
3093
+ value: 57.82000000000001
3094
+ - task:
3095
+ type: Retrieval
3096
+ dataset:
3097
+ type: mteb/cqadupstack-english
3098
+ name: MTEB CQADupstackEnglishRetrieval
3099
+ config: default
3100
+ split: test
3101
+ revision: ad9991cb51e31e31e430383c75ffb2885547b5f0
3102
+ metrics:
3103
+ - type: map_at_1
3104
+ value: 33.289
3105
+ - type: map_at_10
3106
+ value: 43.681
3107
+ - type: map_at_100
3108
+ value: 45.056000000000004
3109
+ - type: map_at_1000
3110
+ value: 45.171
3111
+ - type: map_at_3
3112
+ value: 40.702
3113
+ - type: map_at_5
3114
+ value: 42.292
3115
+ - type: mrr_at_1
3116
+ value: 41.146
3117
+ - type: mrr_at_10
3118
+ value: 49.604
3119
+ - type: mrr_at_100
3120
+ value: 50.28399999999999
3121
+ - type: mrr_at_1000
3122
+ value: 50.322
3123
+ - type: mrr_at_3
3124
+ value: 47.611
3125
+ - type: mrr_at_5
3126
+ value: 48.717
3127
+ - type: ndcg_at_1
3128
+ value: 41.146
3129
+ - type: ndcg_at_10
3130
+ value: 49.43
3131
+ - type: ndcg_at_100
3132
+ value: 54.01899999999999
3133
+ - type: ndcg_at_1000
3134
+ value: 55.803000000000004
3135
+ - type: ndcg_at_3
3136
+ value: 45.503
3137
+ - type: ndcg_at_5
3138
+ value: 47.198
3139
+ - type: precision_at_1
3140
+ value: 41.146
3141
+ - type: precision_at_10
3142
+ value: 9.268
3143
+ - type: precision_at_100
3144
+ value: 1.4749999999999999
3145
+ - type: precision_at_1000
3146
+ value: 0.19
3147
+ - type: precision_at_3
3148
+ value: 21.932
3149
+ - type: precision_at_5
3150
+ value: 15.389
3151
+ - type: recall_at_1
3152
+ value: 33.289
3153
+ - type: recall_at_10
3154
+ value: 59.209999999999994
3155
+ - type: recall_at_100
3156
+ value: 78.676
3157
+ - type: recall_at_1000
3158
+ value: 89.84100000000001
3159
+ - type: recall_at_3
3160
+ value: 47.351
3161
+ - type: recall_at_5
3162
+ value: 52.178999999999995
3163
+ - task:
3164
+ type: Retrieval
3165
+ dataset:
3166
+ type: mteb/cqadupstack-gaming
3167
+ name: MTEB CQADupstackGamingRetrieval
3168
+ config: default
3169
+ split: test
3170
+ revision: 4885aa143210c98657558c04aaf3dc47cfb54340
3171
+ metrics:
3172
+ - type: map_at_1
3173
+ value: 44.483
3174
+ - type: map_at_10
3175
+ value: 56.862
3176
+ - type: map_at_100
3177
+ value: 57.901
3178
+ - type: map_at_1000
3179
+ value: 57.948
3180
+ - type: map_at_3
3181
+ value: 53.737
3182
+ - type: map_at_5
3183
+ value: 55.64
3184
+ - type: mrr_at_1
3185
+ value: 50.658
3186
+ - type: mrr_at_10
3187
+ value: 60.281
3188
+ - type: mrr_at_100
3189
+ value: 60.946
3190
+ - type: mrr_at_1000
3191
+ value: 60.967000000000006
3192
+ - type: mrr_at_3
3193
+ value: 58.192
3194
+ - type: mrr_at_5
3195
+ value: 59.531
3196
+ - type: ndcg_at_1
3197
+ value: 50.658
3198
+ - type: ndcg_at_10
3199
+ value: 62.339
3200
+ - type: ndcg_at_100
3201
+ value: 66.28399999999999
3202
+ - type: ndcg_at_1000
3203
+ value: 67.166
3204
+ - type: ndcg_at_3
3205
+ value: 57.458
3206
+ - type: ndcg_at_5
3207
+ value: 60.112
3208
+ - type: precision_at_1
3209
+ value: 50.658
3210
+ - type: precision_at_10
3211
+ value: 9.762
3212
+ - type: precision_at_100
3213
+ value: 1.26
3214
+ - type: precision_at_1000
3215
+ value: 0.13799999999999998
3216
+ - type: precision_at_3
3217
+ value: 25.329
3218
+ - type: precision_at_5
3219
+ value: 17.254
3220
+ - type: recall_at_1
3221
+ value: 44.483
3222
+ - type: recall_at_10
3223
+ value: 74.819
3224
+ - type: recall_at_100
3225
+ value: 91.702
3226
+ - type: recall_at_1000
3227
+ value: 97.84
3228
+ - type: recall_at_3
3229
+ value: 62.13999999999999
3230
+ - type: recall_at_5
3231
+ value: 68.569
3232
+ - task:
3233
+ type: Retrieval
3234
+ dataset:
3235
+ type: mteb/cqadupstack-gis
3236
+ name: MTEB CQADupstackGisRetrieval
3237
+ config: default
3238
+ split: test
3239
+ revision: 5003b3064772da1887988e05400cf3806fe491f2
3240
+ metrics:
3241
+ - type: map_at_1
3242
+ value: 26.489
3243
+ - type: map_at_10
3244
+ value: 37.004999999999995
3245
+ - type: map_at_100
3246
+ value: 38.001000000000005
3247
+ - type: map_at_1000
3248
+ value: 38.085
3249
+ - type: map_at_3
3250
+ value: 34.239999999999995
3251
+ - type: map_at_5
3252
+ value: 35.934
3253
+ - type: mrr_at_1
3254
+ value: 28.362
3255
+ - type: mrr_at_10
3256
+ value: 38.807
3257
+ - type: mrr_at_100
3258
+ value: 39.671
3259
+ - type: mrr_at_1000
3260
+ value: 39.736
3261
+ - type: mrr_at_3
3262
+ value: 36.29
3263
+ - type: mrr_at_5
3264
+ value: 37.906
3265
+ - type: ndcg_at_1
3266
+ value: 28.362
3267
+ - type: ndcg_at_10
3268
+ value: 42.510999999999996
3269
+ - type: ndcg_at_100
3270
+ value: 47.226
3271
+ - type: ndcg_at_1000
3272
+ value: 49.226
3273
+ - type: ndcg_at_3
3274
+ value: 37.295
3275
+ - type: ndcg_at_5
3276
+ value: 40.165
3277
+ - type: precision_at_1
3278
+ value: 28.362
3279
+ - type: precision_at_10
3280
+ value: 6.633
3281
+ - type: precision_at_100
3282
+ value: 0.9490000000000001
3283
+ - type: precision_at_1000
3284
+ value: 0.11499999999999999
3285
+ - type: precision_at_3
3286
+ value: 16.234
3287
+ - type: precision_at_5
3288
+ value: 11.434999999999999
3289
+ - type: recall_at_1
3290
+ value: 26.489
3291
+ - type: recall_at_10
3292
+ value: 57.457
3293
+ - type: recall_at_100
3294
+ value: 78.712
3295
+ - type: recall_at_1000
3296
+ value: 93.565
3297
+ - type: recall_at_3
3298
+ value: 43.748
3299
+ - type: recall_at_5
3300
+ value: 50.589
3301
+ - task:
3302
+ type: Retrieval
3303
+ dataset:
3304
+ type: mteb/cqadupstack-mathematica
3305
+ name: MTEB CQADupstackMathematicaRetrieval
3306
+ config: default
3307
+ split: test
3308
+ revision: 90fceea13679c63fe563ded68f3b6f06e50061de
3309
+ metrics:
3310
+ - type: map_at_1
3311
+ value: 12.418999999999999
3312
+ - type: map_at_10
3313
+ value: 22.866
3314
+ - type: map_at_100
3315
+ value: 24.365000000000002
3316
+ - type: map_at_1000
3317
+ value: 24.479
3318
+ - type: map_at_3
3319
+ value: 19.965
3320
+ - type: map_at_5
3321
+ value: 21.684
3322
+ - type: mrr_at_1
3323
+ value: 14.677000000000001
3324
+ - type: mrr_at_10
3325
+ value: 26.316
3326
+ - type: mrr_at_100
3327
+ value: 27.514
3328
+ - type: mrr_at_1000
3329
+ value: 27.57
3330
+ - type: mrr_at_3
3331
+ value: 23.3
3332
+ - type: mrr_at_5
3333
+ value: 25.191000000000003
3334
+ - type: ndcg_at_1
3335
+ value: 14.677000000000001
3336
+ - type: ndcg_at_10
3337
+ value: 28.875
3338
+ - type: ndcg_at_100
3339
+ value: 35.607
3340
+ - type: ndcg_at_1000
3341
+ value: 38.237
3342
+ - type: ndcg_at_3
3343
+ value: 23.284
3344
+ - type: ndcg_at_5
3345
+ value: 26.226
3346
+ - type: precision_at_1
3347
+ value: 14.677000000000001
3348
+ - type: precision_at_10
3349
+ value: 5.771
3350
+ - type: precision_at_100
3351
+ value: 1.058
3352
+ - type: precision_at_1000
3353
+ value: 0.14200000000000002
3354
+ - type: precision_at_3
3355
+ value: 11.940000000000001
3356
+ - type: precision_at_5
3357
+ value: 9.229
3358
+ - type: recall_at_1
3359
+ value: 12.418999999999999
3360
+ - type: recall_at_10
3361
+ value: 43.333
3362
+ - type: recall_at_100
3363
+ value: 71.942
3364
+ - type: recall_at_1000
3365
+ value: 90.67399999999999
3366
+ - type: recall_at_3
3367
+ value: 28.787000000000003
3368
+ - type: recall_at_5
3369
+ value: 35.638
3370
+ - task:
3371
+ type: Retrieval
3372
+ dataset:
3373
+ type: mteb/cqadupstack-physics
3374
+ name: MTEB CQADupstackPhysicsRetrieval
3375
+ config: default
3376
+ split: test
3377
+ revision: 79531abbd1fb92d06c6d6315a0cbbbf5bb247ea4
3378
+ metrics:
3379
+ - type: map_at_1
3380
+ value: 31.686999999999998
3381
+ - type: map_at_10
3382
+ value: 42.331
3383
+ - type: map_at_100
3384
+ value: 43.655
3385
+ - type: map_at_1000
3386
+ value: 43.771
3387
+ - type: map_at_3
3388
+ value: 38.944
3389
+ - type: map_at_5
3390
+ value: 40.991
3391
+ - type: mrr_at_1
3392
+ value: 37.921
3393
+ - type: mrr_at_10
3394
+ value: 47.534
3395
+ - type: mrr_at_100
3396
+ value: 48.362
3397
+ - type: mrr_at_1000
3398
+ value: 48.405
3399
+ - type: mrr_at_3
3400
+ value: 44.995000000000005
3401
+ - type: mrr_at_5
3402
+ value: 46.617
3403
+ - type: ndcg_at_1
3404
+ value: 37.921
3405
+ - type: ndcg_at_10
3406
+ value: 48.236000000000004
3407
+ - type: ndcg_at_100
3408
+ value: 53.705000000000005
3409
+ - type: ndcg_at_1000
3410
+ value: 55.596000000000004
3411
+ - type: ndcg_at_3
3412
+ value: 43.11
3413
+ - type: ndcg_at_5
3414
+ value: 45.862
3415
+ - type: precision_at_1
3416
+ value: 37.921
3417
+ - type: precision_at_10
3418
+ value: 8.643
3419
+ - type: precision_at_100
3420
+ value: 1.336
3421
+ - type: precision_at_1000
3422
+ value: 0.166
3423
+ - type: precision_at_3
3424
+ value: 20.308
3425
+ - type: precision_at_5
3426
+ value: 14.514
3427
+ - type: recall_at_1
3428
+ value: 31.686999999999998
3429
+ - type: recall_at_10
3430
+ value: 60.126999999999995
3431
+ - type: recall_at_100
3432
+ value: 83.10600000000001
3433
+ - type: recall_at_1000
3434
+ value: 95.15
3435
+ - type: recall_at_3
3436
+ value: 46.098
3437
+ - type: recall_at_5
3438
+ value: 53.179
3439
+ - task:
3440
+ type: Retrieval
3441
+ dataset:
3442
+ type: mteb/cqadupstack-programmers
3443
+ name: MTEB CQADupstackProgrammersRetrieval
3444
+ config: default
3445
+ split: test
3446
+ revision: 6184bc1440d2dbc7612be22b50686b8826d22b32
3447
+ metrics:
3448
+ - type: map_at_1
3449
+ value: 28.686
3450
+ - type: map_at_10
3451
+ value: 39.146
3452
+ - type: map_at_100
3453
+ value: 40.543
3454
+ - type: map_at_1000
3455
+ value: 40.644999999999996
3456
+ - type: map_at_3
3457
+ value: 36.195
3458
+ - type: map_at_5
3459
+ value: 37.919000000000004
3460
+ - type: mrr_at_1
3461
+ value: 35.160000000000004
3462
+ - type: mrr_at_10
3463
+ value: 44.711
3464
+ - type: mrr_at_100
3465
+ value: 45.609
3466
+ - type: mrr_at_1000
3467
+ value: 45.655
3468
+ - type: mrr_at_3
3469
+ value: 42.409
3470
+ - type: mrr_at_5
3471
+ value: 43.779
3472
+ - type: ndcg_at_1
3473
+ value: 35.160000000000004
3474
+ - type: ndcg_at_10
3475
+ value: 44.977000000000004
3476
+ - type: ndcg_at_100
3477
+ value: 50.663000000000004
3478
+ - type: ndcg_at_1000
3479
+ value: 52.794
3480
+ - type: ndcg_at_3
3481
+ value: 40.532000000000004
3482
+ - type: ndcg_at_5
3483
+ value: 42.641
3484
+ - type: precision_at_1
3485
+ value: 35.160000000000004
3486
+ - type: precision_at_10
3487
+ value: 8.014000000000001
3488
+ - type: precision_at_100
3489
+ value: 1.269
3490
+ - type: precision_at_1000
3491
+ value: 0.163
3492
+ - type: precision_at_3
3493
+ value: 19.444
3494
+ - type: precision_at_5
3495
+ value: 13.653
3496
+ - type: recall_at_1
3497
+ value: 28.686
3498
+ - type: recall_at_10
3499
+ value: 56.801
3500
+ - type: recall_at_100
3501
+ value: 80.559
3502
+ - type: recall_at_1000
3503
+ value: 95.052
3504
+ - type: recall_at_3
3505
+ value: 43.675999999999995
3506
+ - type: recall_at_5
3507
+ value: 49.703
3508
+ - task:
3509
+ type: Retrieval
3510
+ dataset:
3511
+ type: mteb/cqadupstack
3512
+ name: MTEB CQADupstackRetrieval
3513
+ config: default
3514
+ split: test
3515
+ revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
3516
+ metrics:
3517
+ - type: map_at_1
3518
+ value: 28.173833333333338
3519
+ - type: map_at_10
3520
+ value: 38.202083333333334
3521
+ - type: map_at_100
3522
+ value: 39.47475
3523
+ - type: map_at_1000
3524
+ value: 39.586499999999994
3525
+ - type: map_at_3
3526
+ value: 35.17308333333334
3527
+ - type: map_at_5
3528
+ value: 36.914
3529
+ - type: mrr_at_1
3530
+ value: 32.92958333333333
3531
+ - type: mrr_at_10
3532
+ value: 42.16758333333333
3533
+ - type: mrr_at_100
3534
+ value: 43.04108333333333
3535
+ - type: mrr_at_1000
3536
+ value: 43.092499999999994
3537
+ - type: mrr_at_3
3538
+ value: 39.69166666666666
3539
+ - type: mrr_at_5
3540
+ value: 41.19458333333333
3541
+ - type: ndcg_at_1
3542
+ value: 32.92958333333333
3543
+ - type: ndcg_at_10
3544
+ value: 43.80583333333333
3545
+ - type: ndcg_at_100
3546
+ value: 49.060916666666664
3547
+ - type: ndcg_at_1000
3548
+ value: 51.127250000000004
3549
+ - type: ndcg_at_3
3550
+ value: 38.80383333333333
3551
+ - type: ndcg_at_5
3552
+ value: 41.29658333333333
3553
+ - type: precision_at_1
3554
+ value: 32.92958333333333
3555
+ - type: precision_at_10
3556
+ value: 7.655666666666666
3557
+ - type: precision_at_100
3558
+ value: 1.2094166666666668
3559
+ - type: precision_at_1000
3560
+ value: 0.15750000000000003
3561
+ - type: precision_at_3
3562
+ value: 17.87975
3563
+ - type: precision_at_5
3564
+ value: 12.741833333333332
3565
+ - type: recall_at_1
3566
+ value: 28.173833333333338
3567
+ - type: recall_at_10
3568
+ value: 56.219249999999995
3569
+ - type: recall_at_100
3570
+ value: 79.01416666666665
3571
+ - type: recall_at_1000
3572
+ value: 93.13425000000001
3573
+ - type: recall_at_3
3574
+ value: 42.39241666666667
3575
+ - type: recall_at_5
3576
+ value: 48.764833333333335
3577
+ - task:
3578
+ type: Retrieval
3579
+ dataset:
3580
+ type: mteb/cqadupstack-stats
3581
+ name: MTEB CQADupstackStatsRetrieval
3582
+ config: default
3583
+ split: test
3584
+ revision: 65ac3a16b8e91f9cee4c9828cc7c335575432a2a
3585
+ metrics:
3586
+ - type: map_at_1
3587
+ value: 25.625999999999998
3588
+ - type: map_at_10
3589
+ value: 32.808
3590
+ - type: map_at_100
3591
+ value: 33.951
3592
+ - type: map_at_1000
3593
+ value: 34.052
3594
+ - type: map_at_3
3595
+ value: 30.536
3596
+ - type: map_at_5
3597
+ value: 31.77
3598
+ - type: mrr_at_1
3599
+ value: 28.374
3600
+ - type: mrr_at_10
3601
+ value: 35.527
3602
+ - type: mrr_at_100
3603
+ value: 36.451
3604
+ - type: mrr_at_1000
3605
+ value: 36.522
3606
+ - type: mrr_at_3
3607
+ value: 33.410000000000004
3608
+ - type: mrr_at_5
3609
+ value: 34.537
3610
+ - type: ndcg_at_1
3611
+ value: 28.374
3612
+ - type: ndcg_at_10
3613
+ value: 37.172
3614
+ - type: ndcg_at_100
3615
+ value: 42.474000000000004
3616
+ - type: ndcg_at_1000
3617
+ value: 44.853
3618
+ - type: ndcg_at_3
3619
+ value: 32.931
3620
+ - type: ndcg_at_5
3621
+ value: 34.882999999999996
3622
+ - type: precision_at_1
3623
+ value: 28.374
3624
+ - type: precision_at_10
3625
+ value: 5.813
3626
+ - type: precision_at_100
3627
+ value: 0.928
3628
+ - type: precision_at_1000
3629
+ value: 0.121
3630
+ - type: precision_at_3
3631
+ value: 14.008000000000001
3632
+ - type: precision_at_5
3633
+ value: 9.754999999999999
3634
+ - type: recall_at_1
3635
+ value: 25.625999999999998
3636
+ - type: recall_at_10
3637
+ value: 47.812
3638
+ - type: recall_at_100
3639
+ value: 71.61800000000001
3640
+ - type: recall_at_1000
3641
+ value: 88.881
3642
+ - type: recall_at_3
3643
+ value: 35.876999999999995
3644
+ - type: recall_at_5
3645
+ value: 40.839
3646
+ - task:
3647
+ type: Retrieval
3648
+ dataset:
3649
+ type: mteb/cqadupstack-tex
3650
+ name: MTEB CQADupstackTexRetrieval
3651
+ config: default
3652
+ split: test
3653
+ revision: 46989137a86843e03a6195de44b09deda022eec7
3654
+ metrics:
3655
+ - type: map_at_1
3656
+ value: 18.233
3657
+ - type: map_at_10
3658
+ value: 26.375999999999998
3659
+ - type: map_at_100
3660
+ value: 27.575
3661
+ - type: map_at_1000
3662
+ value: 27.706999999999997
3663
+ - type: map_at_3
3664
+ value: 23.619
3665
+ - type: map_at_5
3666
+ value: 25.217
3667
+ - type: mrr_at_1
3668
+ value: 22.023
3669
+ - type: mrr_at_10
3670
+ value: 30.122
3671
+ - type: mrr_at_100
3672
+ value: 31.083
3673
+ - type: mrr_at_1000
3674
+ value: 31.163999999999998
3675
+ - type: mrr_at_3
3676
+ value: 27.541
3677
+ - type: mrr_at_5
3678
+ value: 29.061999999999998
3679
+ - type: ndcg_at_1
3680
+ value: 22.023
3681
+ - type: ndcg_at_10
3682
+ value: 31.476
3683
+ - type: ndcg_at_100
3684
+ value: 37.114000000000004
3685
+ - type: ndcg_at_1000
3686
+ value: 39.981
3687
+ - type: ndcg_at_3
3688
+ value: 26.538
3689
+ - type: ndcg_at_5
3690
+ value: 29.016
3691
+ - type: precision_at_1
3692
+ value: 22.023
3693
+ - type: precision_at_10
3694
+ value: 5.819
3695
+ - type: precision_at_100
3696
+ value: 1.018
3697
+ - type: precision_at_1000
3698
+ value: 0.14300000000000002
3699
+ - type: precision_at_3
3700
+ value: 12.583
3701
+ - type: precision_at_5
3702
+ value: 9.36
3703
+ - type: recall_at_1
3704
+ value: 18.233
3705
+ - type: recall_at_10
3706
+ value: 43.029
3707
+ - type: recall_at_100
3708
+ value: 68.253
3709
+ - type: recall_at_1000
3710
+ value: 88.319
3711
+ - type: recall_at_3
3712
+ value: 29.541
3713
+ - type: recall_at_5
3714
+ value: 35.783
3715
+ - task:
3716
+ type: Retrieval
3717
+ dataset:
3718
+ type: mteb/cqadupstack-unix
3719
+ name: MTEB CQADupstackUnixRetrieval
3720
+ config: default
3721
+ split: test
3722
+ revision: 6c6430d3a6d36f8d2a829195bc5dc94d7e063e53
3723
+ metrics:
3724
+ - type: map_at_1
3725
+ value: 28.923
3726
+ - type: map_at_10
3727
+ value: 39.231
3728
+ - type: map_at_100
3729
+ value: 40.483000000000004
3730
+ - type: map_at_1000
3731
+ value: 40.575
3732
+ - type: map_at_3
3733
+ value: 35.94
3734
+ - type: map_at_5
3735
+ value: 37.683
3736
+ - type: mrr_at_1
3737
+ value: 33.955
3738
+ - type: mrr_at_10
3739
+ value: 43.163000000000004
3740
+ - type: mrr_at_100
3741
+ value: 44.054
3742
+ - type: mrr_at_1000
3743
+ value: 44.099
3744
+ - type: mrr_at_3
3745
+ value: 40.361000000000004
3746
+ - type: mrr_at_5
3747
+ value: 41.905
3748
+ - type: ndcg_at_1
3749
+ value: 33.955
3750
+ - type: ndcg_at_10
3751
+ value: 45.068000000000005
3752
+ - type: ndcg_at_100
3753
+ value: 50.470000000000006
3754
+ - type: ndcg_at_1000
3755
+ value: 52.349000000000004
3756
+ - type: ndcg_at_3
3757
+ value: 39.298
3758
+ - type: ndcg_at_5
3759
+ value: 41.821999999999996
3760
+ - type: precision_at_1
3761
+ value: 33.955
3762
+ - type: precision_at_10
3763
+ value: 7.649
3764
+ - type: precision_at_100
3765
+ value: 1.173
3766
+ - type: precision_at_1000
3767
+ value: 0.14200000000000002
3768
+ - type: precision_at_3
3769
+ value: 17.817
3770
+ - type: precision_at_5
3771
+ value: 12.537
3772
+ - type: recall_at_1
3773
+ value: 28.923
3774
+ - type: recall_at_10
3775
+ value: 58.934
3776
+ - type: recall_at_100
3777
+ value: 81.809
3778
+ - type: recall_at_1000
3779
+ value: 94.71300000000001
3780
+ - type: recall_at_3
3781
+ value: 42.975
3782
+ - type: recall_at_5
3783
+ value: 49.501
3784
+ - task:
3785
+ type: Retrieval
3786
+ dataset:
3787
+ type: mteb/cqadupstack-webmasters
3788
+ name: MTEB CQADupstackWebmastersRetrieval
3789
+ config: default
3790
+ split: test
3791
+ revision: 160c094312a0e1facb97e55eeddb698c0abe3571
3792
+ metrics:
3793
+ - type: map_at_1
3794
+ value: 28.596
3795
+ - type: map_at_10
3796
+ value: 38.735
3797
+ - type: map_at_100
3798
+ value: 40.264
3799
+ - type: map_at_1000
3800
+ value: 40.48
3801
+ - type: map_at_3
3802
+ value: 35.394999999999996
3803
+ - type: map_at_5
3804
+ value: 37.099
3805
+ - type: mrr_at_1
3806
+ value: 33.992
3807
+ - type: mrr_at_10
3808
+ value: 43.076
3809
+ - type: mrr_at_100
3810
+ value: 44.005
3811
+ - type: mrr_at_1000
3812
+ value: 44.043
3813
+ - type: mrr_at_3
3814
+ value: 40.415
3815
+ - type: mrr_at_5
3816
+ value: 41.957
3817
+ - type: ndcg_at_1
3818
+ value: 33.992
3819
+ - type: ndcg_at_10
3820
+ value: 44.896
3821
+ - type: ndcg_at_100
3822
+ value: 50.44499999999999
3823
+ - type: ndcg_at_1000
3824
+ value: 52.675000000000004
3825
+ - type: ndcg_at_3
3826
+ value: 39.783
3827
+ - type: ndcg_at_5
3828
+ value: 41.997
3829
+ - type: precision_at_1
3830
+ value: 33.992
3831
+ - type: precision_at_10
3832
+ value: 8.498
3833
+ - type: precision_at_100
3834
+ value: 1.585
3835
+ - type: precision_at_1000
3836
+ value: 0.248
3837
+ - type: precision_at_3
3838
+ value: 18.511
3839
+ - type: precision_at_5
3840
+ value: 13.241
3841
+ - type: recall_at_1
3842
+ value: 28.596
3843
+ - type: recall_at_10
3844
+ value: 56.885
3845
+ - type: recall_at_100
3846
+ value: 82.306
3847
+ - type: recall_at_1000
3848
+ value: 95.813
3849
+ - type: recall_at_3
3850
+ value: 42.168
3851
+ - type: recall_at_5
3852
+ value: 48.32
3853
+ - task:
3854
+ type: Retrieval
3855
+ dataset:
3856
+ type: mteb/cqadupstack-wordpress
3857
+ name: MTEB CQADupstackWordpressRetrieval
3858
+ config: default
3859
+ split: test
3860
+ revision: 4ffe81d471b1924886b33c7567bfb200e9eec5c4
3861
+ metrics:
3862
+ - type: map_at_1
3863
+ value: 25.576
3864
+ - type: map_at_10
3865
+ value: 33.034
3866
+ - type: map_at_100
3867
+ value: 34.117999999999995
3868
+ - type: map_at_1000
3869
+ value: 34.222
3870
+ - type: map_at_3
3871
+ value: 30.183
3872
+ - type: map_at_5
3873
+ value: 31.974000000000004
3874
+ - type: mrr_at_1
3875
+ value: 27.542
3876
+ - type: mrr_at_10
3877
+ value: 34.838
3878
+ - type: mrr_at_100
3879
+ value: 35.824
3880
+ - type: mrr_at_1000
3881
+ value: 35.899
3882
+ - type: mrr_at_3
3883
+ value: 32.348
3884
+ - type: mrr_at_5
3885
+ value: 34.039
3886
+ - type: ndcg_at_1
3887
+ value: 27.542
3888
+ - type: ndcg_at_10
3889
+ value: 37.663000000000004
3890
+ - type: ndcg_at_100
3891
+ value: 42.762
3892
+ - type: ndcg_at_1000
3893
+ value: 45.235
3894
+ - type: ndcg_at_3
3895
+ value: 32.227
3896
+ - type: ndcg_at_5
3897
+ value: 35.27
3898
+ - type: precision_at_1
3899
+ value: 27.542
3900
+ - type: precision_at_10
3901
+ value: 5.840999999999999
3902
+ - type: precision_at_100
3903
+ value: 0.895
3904
+ - type: precision_at_1000
3905
+ value: 0.123
3906
+ - type: precision_at_3
3907
+ value: 13.370000000000001
3908
+ - type: precision_at_5
3909
+ value: 9.797
3910
+ - type: recall_at_1
3911
+ value: 25.576
3912
+ - type: recall_at_10
3913
+ value: 50.285000000000004
3914
+ - type: recall_at_100
3915
+ value: 73.06
3916
+ - type: recall_at_1000
3917
+ value: 91.15299999999999
3918
+ - type: recall_at_3
3919
+ value: 35.781
3920
+ - type: recall_at_5
3921
+ value: 43.058
3922
+ - task:
3923
+ type: Retrieval
3924
+ dataset:
3925
+ type: mteb/climate-fever
3926
+ name: MTEB ClimateFEVER
3927
+ config: default
3928
+ split: test
3929
+ revision: 47f2ac6acb640fc46020b02a5b59fdda04d39380
3930
+ metrics:
3931
+ - type: map_at_1
3932
+ value: 17.061
3933
+ - type: map_at_10
3934
+ value: 29.464000000000002
3935
+ - type: map_at_100
3936
+ value: 31.552999999999997
3937
+ - type: map_at_1000
3938
+ value: 31.707
3939
+ - type: map_at_3
3940
+ value: 24.834999999999997
3941
+ - type: map_at_5
3942
+ value: 27.355
3943
+ - type: mrr_at_1
3944
+ value: 38.958
3945
+ - type: mrr_at_10
3946
+ value: 51.578
3947
+ - type: mrr_at_100
3948
+ value: 52.262
3949
+ - type: mrr_at_1000
3950
+ value: 52.283
3951
+ - type: mrr_at_3
3952
+ value: 48.599
3953
+ - type: mrr_at_5
3954
+ value: 50.404
3955
+ - type: ndcg_at_1
3956
+ value: 38.958
3957
+ - type: ndcg_at_10
3958
+ value: 39.367999999999995
3959
+ - type: ndcg_at_100
3960
+ value: 46.521
3961
+ - type: ndcg_at_1000
3962
+ value: 49.086999999999996
3963
+ - type: ndcg_at_3
3964
+ value: 33.442
3965
+ - type: ndcg_at_5
3966
+ value: 35.515
3967
+ - type: precision_at_1
3968
+ value: 38.958
3969
+ - type: precision_at_10
3970
+ value: 12.110999999999999
3971
+ - type: precision_at_100
3972
+ value: 1.982
3973
+ - type: precision_at_1000
3974
+ value: 0.247
3975
+ - type: precision_at_3
3976
+ value: 25.102999999999998
3977
+ - type: precision_at_5
3978
+ value: 18.971
3979
+ - type: recall_at_1
3980
+ value: 17.061
3981
+ - type: recall_at_10
3982
+ value: 45.198
3983
+ - type: recall_at_100
3984
+ value: 69.18900000000001
3985
+ - type: recall_at_1000
3986
+ value: 83.38499999999999
3987
+ - type: recall_at_3
3988
+ value: 30.241
3989
+ - type: recall_at_5
3990
+ value: 36.851
3991
+ - task:
3992
+ type: Retrieval
3993
+ dataset:
3994
+ type: mteb/dbpedia
3995
+ name: MTEB DBPedia
3996
+ config: default
3997
+ split: test
3998
+ revision: c0f706b76e590d620bd6618b3ca8efdd34e2d659
3999
+ metrics:
4000
+ - type: map_at_1
4001
+ value: 9.398
4002
+ - type: map_at_10
4003
+ value: 21.421
4004
+ - type: map_at_100
4005
+ value: 31.649
4006
+ - type: map_at_1000
4007
+ value: 33.469
4008
+ - type: map_at_3
4009
+ value: 15.310000000000002
4010
+ - type: map_at_5
4011
+ value: 17.946
4012
+ - type: mrr_at_1
4013
+ value: 71
4014
+ - type: mrr_at_10
4015
+ value: 78.92099999999999
4016
+ - type: mrr_at_100
4017
+ value: 79.225
4018
+ - type: mrr_at_1000
4019
+ value: 79.23
4020
+ - type: mrr_at_3
4021
+ value: 77.792
4022
+ - type: mrr_at_5
4023
+ value: 78.467
4024
+ - type: ndcg_at_1
4025
+ value: 57.99999999999999
4026
+ - type: ndcg_at_10
4027
+ value: 44.733000000000004
4028
+ - type: ndcg_at_100
4029
+ value: 50.646
4030
+ - type: ndcg_at_1000
4031
+ value: 57.903999999999996
4032
+ - type: ndcg_at_3
4033
+ value: 49.175999999999995
4034
+ - type: ndcg_at_5
4035
+ value: 46.800999999999995
4036
+ - type: precision_at_1
4037
+ value: 71
4038
+ - type: precision_at_10
4039
+ value: 36.25
4040
+ - type: precision_at_100
4041
+ value: 12.135
4042
+ - type: precision_at_1000
4043
+ value: 2.26
4044
+ - type: precision_at_3
4045
+ value: 52.75
4046
+ - type: precision_at_5
4047
+ value: 45.65
4048
+ - type: recall_at_1
4049
+ value: 9.398
4050
+ - type: recall_at_10
4051
+ value: 26.596999999999998
4052
+ - type: recall_at_100
4053
+ value: 57.943
4054
+ - type: recall_at_1000
4055
+ value: 81.147
4056
+ - type: recall_at_3
4057
+ value: 16.634
4058
+ - type: recall_at_5
4059
+ value: 20.7
4060
+ - task:
4061
+ type: Classification
4062
+ dataset:
4063
+ type: mteb/emotion
4064
+ name: MTEB EmotionClassification
4065
+ config: default
4066
+ split: test
4067
+ revision: 4f58c6b202a23cf9a4da393831edf4f9183cad37
4068
+ metrics:
4069
+ - type: accuracy
4070
+ value: 46.535000000000004
4071
+ - type: f1
4072
+ value: 42.53702746452163
4073
+ - task:
4074
+ type: Retrieval
4075
+ dataset:
4076
+ type: mteb/fever
4077
+ name: MTEB FEVER
4078
+ config: default
4079
+ split: test
4080
+ revision: bea83ef9e8fb933d90a2f1d5515737465d613e12
4081
+ metrics:
4082
+ - type: map_at_1
4083
+ value: 77.235
4084
+ - type: map_at_10
4085
+ value: 85.504
4086
+ - type: map_at_100
4087
+ value: 85.707
4088
+ - type: map_at_1000
4089
+ value: 85.718
4090
+ - type: map_at_3
4091
+ value: 84.425
4092
+ - type: map_at_5
4093
+ value: 85.13
4094
+ - type: mrr_at_1
4095
+ value: 83.363
4096
+ - type: mrr_at_10
4097
+ value: 89.916
4098
+ - type: mrr_at_100
4099
+ value: 89.955
4100
+ - type: mrr_at_1000
4101
+ value: 89.956
4102
+ - type: mrr_at_3
4103
+ value: 89.32600000000001
4104
+ - type: mrr_at_5
4105
+ value: 89.79
4106
+ - type: ndcg_at_1
4107
+ value: 83.363
4108
+ - type: ndcg_at_10
4109
+ value: 89.015
4110
+ - type: ndcg_at_100
4111
+ value: 89.649
4112
+ - type: ndcg_at_1000
4113
+ value: 89.825
4114
+ - type: ndcg_at_3
4115
+ value: 87.45100000000001
4116
+ - type: ndcg_at_5
4117
+ value: 88.39399999999999
4118
+ - type: precision_at_1
4119
+ value: 83.363
4120
+ - type: precision_at_10
4121
+ value: 10.659
4122
+ - type: precision_at_100
4123
+ value: 1.122
4124
+ - type: precision_at_1000
4125
+ value: 0.11499999999999999
4126
+ - type: precision_at_3
4127
+ value: 33.338
4128
+ - type: precision_at_5
4129
+ value: 20.671999999999997
4130
+ - type: recall_at_1
4131
+ value: 77.235
4132
+ - type: recall_at_10
4133
+ value: 95.389
4134
+ - type: recall_at_100
4135
+ value: 97.722
4136
+ - type: recall_at_1000
4137
+ value: 98.744
4138
+ - type: recall_at_3
4139
+ value: 91.19800000000001
4140
+ - type: recall_at_5
4141
+ value: 93.635
4142
+ - task:
4143
+ type: Retrieval
4144
+ dataset:
4145
+ type: mteb/fiqa
4146
+ name: MTEB FiQA2018
4147
+ config: default
4148
+ split: test
4149
+ revision: 27a168819829fe9bcd655c2df245fb19452e8e06
4150
+ metrics:
4151
+ - type: map_at_1
4152
+ value: 20.835
4153
+ - type: map_at_10
4154
+ value: 34.459
4155
+ - type: map_at_100
4156
+ value: 36.335
4157
+ - type: map_at_1000
4158
+ value: 36.518
4159
+ - type: map_at_3
4160
+ value: 30.581000000000003
4161
+ - type: map_at_5
4162
+ value: 32.859
4163
+ - type: mrr_at_1
4164
+ value: 40.894999999999996
4165
+ - type: mrr_at_10
4166
+ value: 50.491
4167
+ - type: mrr_at_100
4168
+ value: 51.243
4169
+ - type: mrr_at_1000
4170
+ value: 51.286
4171
+ - type: mrr_at_3
4172
+ value: 47.994
4173
+ - type: mrr_at_5
4174
+ value: 49.429
4175
+ - type: ndcg_at_1
4176
+ value: 40.894999999999996
4177
+ - type: ndcg_at_10
4178
+ value: 42.403
4179
+ - type: ndcg_at_100
4180
+ value: 48.954
4181
+ - type: ndcg_at_1000
4182
+ value: 51.961
4183
+ - type: ndcg_at_3
4184
+ value: 39.11
4185
+ - type: ndcg_at_5
4186
+ value: 40.152
4187
+ - type: precision_at_1
4188
+ value: 40.894999999999996
4189
+ - type: precision_at_10
4190
+ value: 11.466
4191
+ - type: precision_at_100
4192
+ value: 1.833
4193
+ - type: precision_at_1000
4194
+ value: 0.23700000000000002
4195
+ - type: precision_at_3
4196
+ value: 25.874000000000002
4197
+ - type: precision_at_5
4198
+ value: 19.012
4199
+ - type: recall_at_1
4200
+ value: 20.835
4201
+ - type: recall_at_10
4202
+ value: 49.535000000000004
4203
+ - type: recall_at_100
4204
+ value: 73.39099999999999
4205
+ - type: recall_at_1000
4206
+ value: 91.01599999999999
4207
+ - type: recall_at_3
4208
+ value: 36.379
4209
+ - type: recall_at_5
4210
+ value: 42.059999999999995
4211
+ - task:
4212
+ type: Retrieval
4213
+ dataset:
4214
+ type: mteb/hotpotqa
4215
+ name: MTEB HotpotQA
4216
+ config: default
4217
+ split: test
4218
+ revision: ab518f4d6fcca38d87c25209f94beba119d02014
4219
+ metrics:
4220
+ - type: map_at_1
4221
+ value: 40.945
4222
+ - type: map_at_10
4223
+ value: 65.376
4224
+ - type: map_at_100
4225
+ value: 66.278
4226
+ - type: map_at_1000
4227
+ value: 66.33
4228
+ - type: map_at_3
4229
+ value: 61.753
4230
+ - type: map_at_5
4231
+ value: 64.077
4232
+ - type: mrr_at_1
4233
+ value: 81.891
4234
+ - type: mrr_at_10
4235
+ value: 87.256
4236
+ - type: mrr_at_100
4237
+ value: 87.392
4238
+ - type: mrr_at_1000
4239
+ value: 87.395
4240
+ - type: mrr_at_3
4241
+ value: 86.442
4242
+ - type: mrr_at_5
4243
+ value: 86.991
4244
+ - type: ndcg_at_1
4245
+ value: 81.891
4246
+ - type: ndcg_at_10
4247
+ value: 73.654
4248
+ - type: ndcg_at_100
4249
+ value: 76.62299999999999
4250
+ - type: ndcg_at_1000
4251
+ value: 77.60000000000001
4252
+ - type: ndcg_at_3
4253
+ value: 68.71199999999999
4254
+ - type: ndcg_at_5
4255
+ value: 71.563
4256
+ - type: precision_at_1
4257
+ value: 81.891
4258
+ - type: precision_at_10
4259
+ value: 15.409
4260
+ - type: precision_at_100
4261
+ value: 1.77
4262
+ - type: precision_at_1000
4263
+ value: 0.19
4264
+ - type: precision_at_3
4265
+ value: 44.15
4266
+ - type: precision_at_5
4267
+ value: 28.732000000000003
4268
+ - type: recall_at_1
4269
+ value: 40.945
4270
+ - type: recall_at_10
4271
+ value: 77.04299999999999
4272
+ - type: recall_at_100
4273
+ value: 88.508
4274
+ - type: recall_at_1000
4275
+ value: 94.943
4276
+ - type: recall_at_3
4277
+ value: 66.226
4278
+ - type: recall_at_5
4279
+ value: 71.83
4280
+ - task:
4281
+ type: Classification
4282
+ dataset:
4283
+ type: mteb/imdb
4284
+ name: MTEB ImdbClassification
4285
+ config: default
4286
+ split: test
4287
+ revision: 3d86128a09e091d6018b6d26cad27f2739fc2db7
4288
+ metrics:
4289
+ - type: accuracy
4290
+ value: 74.08200000000001
4291
+ - type: ap
4292
+ value: 68.10929101713998
4293
+ - type: f1
4294
+ value: 73.98447117652009
4295
+ - task:
4296
+ type: Retrieval
4297
+ dataset:
4298
+ type: mteb/msmarco
4299
+ name: MTEB MSMARCO
4300
+ config: default
4301
+ split: dev
4302
+ revision: c5a29a104738b98a9e76336939199e264163d4a0
4303
+ metrics:
4304
+ - type: map_at_1
4305
+ value: 21.729000000000003
4306
+ - type: map_at_10
4307
+ value: 34.602
4308
+ - type: map_at_100
4309
+ value: 35.756
4310
+ - type: map_at_1000
4311
+ value: 35.803000000000004
4312
+ - type: map_at_3
4313
+ value: 30.619000000000003
4314
+ - type: map_at_5
4315
+ value: 32.914
4316
+ - type: mrr_at_1
4317
+ value: 22.364
4318
+ - type: mrr_at_10
4319
+ value: 35.183
4320
+ - type: mrr_at_100
4321
+ value: 36.287000000000006
4322
+ - type: mrr_at_1000
4323
+ value: 36.327999999999996
4324
+ - type: mrr_at_3
4325
+ value: 31.258000000000003
4326
+ - type: mrr_at_5
4327
+ value: 33.542
4328
+ - type: ndcg_at_1
4329
+ value: 22.364
4330
+ - type: ndcg_at_10
4331
+ value: 41.765
4332
+ - type: ndcg_at_100
4333
+ value: 47.293
4334
+ - type: ndcg_at_1000
4335
+ value: 48.457
4336
+ - type: ndcg_at_3
4337
+ value: 33.676
4338
+ - type: ndcg_at_5
4339
+ value: 37.783
4340
+ - type: precision_at_1
4341
+ value: 22.364
4342
+ - type: precision_at_10
4343
+ value: 6.662
4344
+ - type: precision_at_100
4345
+ value: 0.943
4346
+ - type: precision_at_1000
4347
+ value: 0.104
4348
+ - type: precision_at_3
4349
+ value: 14.435999999999998
4350
+ - type: precision_at_5
4351
+ value: 10.764999999999999
4352
+ - type: recall_at_1
4353
+ value: 21.729000000000003
4354
+ - type: recall_at_10
4355
+ value: 63.815999999999995
4356
+ - type: recall_at_100
4357
+ value: 89.265
4358
+ - type: recall_at_1000
4359
+ value: 98.149
4360
+ - type: recall_at_3
4361
+ value: 41.898
4362
+ - type: recall_at_5
4363
+ value: 51.76500000000001
4364
+ - task:
4365
+ type: Classification
4366
+ dataset:
4367
+ type: mteb/mtop_domain
4368
+ name: MTEB MTOPDomainClassification (en)
4369
+ config: en
4370
+ split: test
4371
+ revision: d80d48c1eb48d3562165c59d59d0034df9fff0bf
4372
+ metrics:
4373
+ - type: accuracy
4374
+ value: 92.73141814865483
4375
+ - type: f1
4376
+ value: 92.17518476408004
4377
+ - task:
4378
+ type: Classification
4379
+ dataset:
4380
+ type: mteb/mtop_intent
4381
+ name: MTEB MTOPIntentClassification (en)
4382
+ config: en
4383
+ split: test
4384
+ revision: ae001d0e6b1228650b7bd1c2c65fb50ad11a8aba
4385
+ metrics:
4386
+ - type: accuracy
4387
+ value: 65.18011855905152
4388
+ - type: f1
4389
+ value: 46.70999638311856
4390
+ - task:
4391
+ type: Classification
4392
+ dataset:
4393
+ type: masakhane/masakhanews
4394
+ name: MTEB MasakhaNEWSClassification (eng)
4395
+ config: eng
4396
+ split: test
4397
+ revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
4398
+ metrics:
4399
+ - type: accuracy
4400
+ value: 75.24261603375525
4401
+ - type: f1
4402
+ value: 74.07895183913367
4403
+ - task:
4404
+ type: Clustering
4405
+ dataset:
4406
+ type: masakhane/masakhanews
4407
+ name: MTEB MasakhaNEWSClusteringP2P (eng)
4408
+ config: eng
4409
+ split: test
4410
+ revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
4411
+ metrics:
4412
+ - type: v_measure
4413
+ value: 28.43855875387446
4414
+ - task:
4415
+ type: Clustering
4416
+ dataset:
4417
+ type: masakhane/masakhanews
4418
+ name: MTEB MasakhaNEWSClusteringS2S (eng)
4419
+ config: eng
4420
+ split: test
4421
+ revision: 8ccc72e69e65f40c70e117d8b3c08306bb788b60
4422
+ metrics:
4423
+ - type: v_measure
4424
+ value: 29.05331990256969
4425
+ - task:
4426
+ type: Classification
4427
+ dataset:
4428
+ type: mteb/amazon_massive_intent
4429
+ name: MTEB MassiveIntentClassification (en)
4430
+ config: en
4431
+ split: test
4432
+ revision: 31efe3c427b0bae9c22cbb560b8f15491cc6bed7
4433
+ metrics:
4434
+ - type: accuracy
4435
+ value: 66.92333557498318
4436
+ - type: f1
4437
+ value: 64.29789389602692
4438
+ - task:
4439
+ type: Classification
4440
+ dataset:
4441
+ type: mteb/amazon_massive_scenario
4442
+ name: MTEB MassiveScenarioClassification (en)
4443
+ config: en
4444
+ split: test
4445
+ revision: 7d571f92784cd94a019292a1f45445077d0ef634
4446
+ metrics:
4447
+ - type: accuracy
4448
+ value: 72.74714189643578
4449
+ - type: f1
4450
+ value: 71.672585608315
4451
+ - task:
4452
+ type: Clustering
4453
+ dataset:
4454
+ type: mteb/medrxiv-clustering-p2p
4455
+ name: MTEB MedrxivClusteringP2P
4456
+ config: default
4457
+ split: test
4458
+ revision: e7a26af6f3ae46b30dde8737f02c07b1505bcc73
4459
+ metrics:
4460
+ - type: v_measure
4461
+ value: 31.503564225501613
4462
+ - task:
4463
+ type: Clustering
4464
+ dataset:
4465
+ type: mteb/medrxiv-clustering-s2s
4466
+ name: MTEB MedrxivClusteringS2S
4467
+ config: default
4468
+ split: test
4469
+ revision: 35191c8c0dca72d8ff3efcd72aa802307d469663
4470
+ metrics:
4471
+ - type: v_measure
4472
+ value: 28.410225127136457
4473
+ - task:
4474
+ type: Reranking
4475
+ dataset:
4476
+ type: mteb/mind_small
4477
+ name: MTEB MindSmallReranking
4478
+ config: default
4479
+ split: test
4480
+ revision: 3bdac13927fdc888b903db93b2ffdbd90b295a69
4481
+ metrics:
4482
+ - type: map
4483
+ value: 29.170019896091908
4484
+ - type: mrr
4485
+ value: 29.881276831500976
4486
+ - task:
4487
+ type: Retrieval
4488
+ dataset:
4489
+ type: mteb/nfcorpus
4490
+ name: MTEB NFCorpus
4491
+ config: default
4492
+ split: test
4493
+ revision: ec0fa4fe99da2ff19ca1214b7966684033a58814
4494
+ metrics:
4495
+ - type: map_at_1
4496
+ value: 6.544
4497
+ - type: map_at_10
4498
+ value: 14.116999999999999
4499
+ - type: map_at_100
4500
+ value: 17.522
4501
+ - type: map_at_1000
4502
+ value: 19
4503
+ - type: map_at_3
4504
+ value: 10.369
4505
+ - type: map_at_5
4506
+ value: 12.189
4507
+ - type: mrr_at_1
4508
+ value: 47.988
4509
+ - type: mrr_at_10
4510
+ value: 56.84
4511
+ - type: mrr_at_100
4512
+ value: 57.367000000000004
4513
+ - type: mrr_at_1000
4514
+ value: 57.403000000000006
4515
+ - type: mrr_at_3
4516
+ value: 54.592
4517
+ - type: mrr_at_5
4518
+ value: 56.233
4519
+ - type: ndcg_at_1
4520
+ value: 45.82
4521
+ - type: ndcg_at_10
4522
+ value: 36.767
4523
+ - type: ndcg_at_100
4524
+ value: 33.356
4525
+ - type: ndcg_at_1000
4526
+ value: 42.062
4527
+ - type: ndcg_at_3
4528
+ value: 42.15
4529
+ - type: ndcg_at_5
4530
+ value: 40.355000000000004
4531
+ - type: precision_at_1
4532
+ value: 47.988
4533
+ - type: precision_at_10
4534
+ value: 27.121000000000002
4535
+ - type: precision_at_100
4536
+ value: 8.455
4537
+ - type: precision_at_1000
4538
+ value: 2.103
4539
+ - type: precision_at_3
4540
+ value: 39.628
4541
+ - type: precision_at_5
4542
+ value: 35.356
4543
+ - type: recall_at_1
4544
+ value: 6.544
4545
+ - type: recall_at_10
4546
+ value: 17.928
4547
+ - type: recall_at_100
4548
+ value: 32.843
4549
+ - type: recall_at_1000
4550
+ value: 65.752
4551
+ - type: recall_at_3
4552
+ value: 11.297
4553
+ - type: recall_at_5
4554
+ value: 14.357000000000001
4555
+ - task:
4556
+ type: Retrieval
4557
+ dataset:
4558
+ type: mteb/nq
4559
+ name: MTEB NQ
4560
+ config: default
4561
+ split: test
4562
+ revision: b774495ed302d8c44a3a7ea25c90dbce03968f31
4563
+ metrics:
4564
+ - type: map_at_1
4565
+ value: 39.262
4566
+ - type: map_at_10
4567
+ value: 55.095000000000006
4568
+ - type: map_at_100
4569
+ value: 55.93900000000001
4570
+ - type: map_at_1000
4571
+ value: 55.955999999999996
4572
+ - type: map_at_3
4573
+ value: 50.93
4574
+ - type: map_at_5
4575
+ value: 53.491
4576
+ - type: mrr_at_1
4577
+ value: 43.598
4578
+ - type: mrr_at_10
4579
+ value: 57.379999999999995
4580
+ - type: mrr_at_100
4581
+ value: 57.940999999999995
4582
+ - type: mrr_at_1000
4583
+ value: 57.952000000000005
4584
+ - type: mrr_at_3
4585
+ value: 53.998000000000005
4586
+ - type: mrr_at_5
4587
+ value: 56.128
4588
+ - type: ndcg_at_1
4589
+ value: 43.598
4590
+ - type: ndcg_at_10
4591
+ value: 62.427
4592
+ - type: ndcg_at_100
4593
+ value: 65.759
4594
+ - type: ndcg_at_1000
4595
+ value: 66.133
4596
+ - type: ndcg_at_3
4597
+ value: 54.745999999999995
4598
+ - type: ndcg_at_5
4599
+ value: 58.975
4600
+ - type: precision_at_1
4601
+ value: 43.598
4602
+ - type: precision_at_10
4603
+ value: 9.789
4604
+ - type: precision_at_100
4605
+ value: 1.171
4606
+ - type: precision_at_1000
4607
+ value: 0.121
4608
+ - type: precision_at_3
4609
+ value: 24.295
4610
+ - type: precision_at_5
4611
+ value: 17.028
4612
+ - type: recall_at_1
4613
+ value: 39.262
4614
+ - type: recall_at_10
4615
+ value: 82.317
4616
+ - type: recall_at_100
4617
+ value: 96.391
4618
+ - type: recall_at_1000
4619
+ value: 99.116
4620
+ - type: recall_at_3
4621
+ value: 62.621
4622
+ - type: recall_at_5
4623
+ value: 72.357
4624
+ - task:
4625
+ type: Classification
4626
+ dataset:
4627
+ type: ag_news
4628
+ name: MTEB NewsClassification
4629
+ config: default
4630
+ split: test
4631
+ revision: eb185aade064a813bc0b7f42de02595523103ca4
4632
+ metrics:
4633
+ - type: accuracy
4634
+ value: 78.17500000000001
4635
+ - type: f1
4636
+ value: 78.01940892857273
4637
+ - task:
4638
+ type: PairClassification
4639
+ dataset:
4640
+ type: GEM/opusparcus
4641
+ name: MTEB OpusparcusPC (en)
4642
+ config: en
4643
+ split: test
4644
+ revision: 9e9b1f8ef51616073f47f306f7f47dd91663f86a
4645
+ metrics:
4646
+ - type: cos_sim_accuracy
4647
+ value: 99.89816700610999
4648
+ - type: cos_sim_ap
4649
+ value: 100
4650
+ - type: cos_sim_f1
4651
+ value: 99.9490575649516
4652
+ - type: cos_sim_precision
4653
+ value: 100
4654
+ - type: cos_sim_recall
4655
+ value: 99.89816700610999
4656
+ - type: dot_accuracy
4657
+ value: 99.89816700610999
4658
+ - type: dot_ap
4659
+ value: 100
4660
+ - type: dot_f1
4661
+ value: 99.9490575649516
4662
+ - type: dot_precision
4663
+ value: 100
4664
+ - type: dot_recall
4665
+ value: 99.89816700610999
4666
+ - type: euclidean_accuracy
4667
+ value: 99.89816700610999
4668
+ - type: euclidean_ap
4669
+ value: 100
4670
+ - type: euclidean_f1
4671
+ value: 99.9490575649516
4672
+ - type: euclidean_precision
4673
+ value: 100
4674
+ - type: euclidean_recall
4675
+ value: 99.89816700610999
4676
+ - type: manhattan_accuracy
4677
+ value: 99.89816700610999
4678
+ - type: manhattan_ap
4679
+ value: 100
4680
+ - type: manhattan_f1
4681
+ value: 99.9490575649516
4682
+ - type: manhattan_precision
4683
+ value: 100
4684
+ - type: manhattan_recall
4685
+ value: 99.89816700610999
4686
+ - type: max_accuracy
4687
+ value: 99.89816700610999
4688
+ - type: max_ap
4689
+ value: 100
4690
+ - type: max_f1
4691
+ value: 99.9490575649516
4692
+ - task:
4693
+ type: PairClassification
4694
+ dataset:
4695
+ type: paws-x
4696
+ name: MTEB PawsX (en)
4697
+ config: en
4698
+ split: test
4699
+ revision: 8a04d940a42cd40658986fdd8e3da561533a3646
4700
+ metrics:
4701
+ - type: cos_sim_accuracy
4702
+ value: 61
4703
+ - type: cos_sim_ap
4704
+ value: 59.630757252602464
4705
+ - type: cos_sim_f1
4706
+ value: 62.37521514629949
4707
+ - type: cos_sim_precision
4708
+ value: 45.34534534534534
4709
+ - type: cos_sim_recall
4710
+ value: 99.88974641675854
4711
+ - type: dot_accuracy
4712
+ value: 61
4713
+ - type: dot_ap
4714
+ value: 59.631527308059006
4715
+ - type: dot_f1
4716
+ value: 62.37521514629949
4717
+ - type: dot_precision
4718
+ value: 45.34534534534534
4719
+ - type: dot_recall
4720
+ value: 99.88974641675854
4721
+ - type: euclidean_accuracy
4722
+ value: 61
4723
+ - type: euclidean_ap
4724
+ value: 59.630757252602464
4725
+ - type: euclidean_f1
4726
+ value: 62.37521514629949
4727
+ - type: euclidean_precision
4728
+ value: 45.34534534534534
4729
+ - type: euclidean_recall
4730
+ value: 99.88974641675854
4731
+ - type: manhattan_accuracy
4732
+ value: 60.9
4733
+ - type: manhattan_ap
4734
+ value: 59.613947780462254
4735
+ - type: manhattan_f1
4736
+ value: 62.37521514629949
4737
+ - type: manhattan_precision
4738
+ value: 45.34534534534534
4739
+ - type: manhattan_recall
4740
+ value: 99.88974641675854
4741
+ - type: max_accuracy
4742
+ value: 61
4743
+ - type: max_ap
4744
+ value: 59.631527308059006
4745
+ - type: max_f1
4746
+ value: 62.37521514629949
4747
+ - task:
4748
+ type: Retrieval
4749
+ dataset:
4750
+ type: mteb/quora
4751
+ name: MTEB QuoraRetrieval
4752
+ config: default
4753
+ split: test
4754
+ revision: e4e08e0b7dbe3c8700f0daef558ff32256715259
4755
+ metrics:
4756
+ - type: map_at_1
4757
+ value: 69.963
4758
+ - type: map_at_10
4759
+ value: 83.59400000000001
4760
+ - type: map_at_100
4761
+ value: 84.236
4762
+ - type: map_at_1000
4763
+ value: 84.255
4764
+ - type: map_at_3
4765
+ value: 80.69800000000001
4766
+ - type: map_at_5
4767
+ value: 82.568
4768
+ - type: mrr_at_1
4769
+ value: 80.58999999999999
4770
+ - type: mrr_at_10
4771
+ value: 86.78200000000001
4772
+ - type: mrr_at_100
4773
+ value: 86.89099999999999
4774
+ - type: mrr_at_1000
4775
+ value: 86.893
4776
+ - type: mrr_at_3
4777
+ value: 85.757
4778
+ - type: mrr_at_5
4779
+ value: 86.507
4780
+ - type: ndcg_at_1
4781
+ value: 80.60000000000001
4782
+ - type: ndcg_at_10
4783
+ value: 87.41799999999999
4784
+ - type: ndcg_at_100
4785
+ value: 88.723
4786
+ - type: ndcg_at_1000
4787
+ value: 88.875
4788
+ - type: ndcg_at_3
4789
+ value: 84.565
4790
+ - type: ndcg_at_5
4791
+ value: 86.236
4792
+ - type: precision_at_1
4793
+ value: 80.60000000000001
4794
+ - type: precision_at_10
4795
+ value: 13.239
4796
+ - type: precision_at_100
4797
+ value: 1.5150000000000001
4798
+ - type: precision_at_1000
4799
+ value: 0.156
4800
+ - type: precision_at_3
4801
+ value: 36.947
4802
+ - type: precision_at_5
4803
+ value: 24.354
4804
+ - type: recall_at_1
4805
+ value: 69.963
4806
+ - type: recall_at_10
4807
+ value: 94.553
4808
+ - type: recall_at_100
4809
+ value: 99.104
4810
+ - type: recall_at_1000
4811
+ value: 99.872
4812
+ - type: recall_at_3
4813
+ value: 86.317
4814
+ - type: recall_at_5
4815
+ value: 91.023
4816
+ - task:
4817
+ type: Clustering
4818
+ dataset:
4819
+ type: mteb/reddit-clustering
4820
+ name: MTEB RedditClustering
4821
+ config: default
4822
+ split: test
4823
+ revision: 24640382cdbf8abc73003fb0fa6d111a705499eb
4824
+ metrics:
4825
+ - type: v_measure
4826
+ value: 47.52890410998761
4827
+ - task:
4828
+ type: Clustering
4829
+ dataset:
4830
+ type: mteb/reddit-clustering-p2p
4831
+ name: MTEB RedditClusteringP2P
4832
+ config: default
4833
+ split: test
4834
+ revision: 385e3cb46b4cfa89021f56c4380204149d0efe33
4835
+ metrics:
4836
+ - type: v_measure
4837
+ value: 62.760692287940486
4838
+ - task:
4839
+ type: Retrieval
4840
+ dataset:
4841
+ type: mteb/scidocs
4842
+ name: MTEB SCIDOCS
4843
+ config: default
4844
+ split: test
4845
+ revision: f8c2fcf00f625baaa80f62ec5bd9e1fff3b8ae88
4846
+ metrics:
4847
+ - type: map_at_1
4848
+ value: 5.093
4849
+ - type: map_at_10
4850
+ value: 12.695
4851
+ - type: map_at_100
4852
+ value: 14.824000000000002
4853
+ - type: map_at_1000
4854
+ value: 15.123000000000001
4855
+ - type: map_at_3
4856
+ value: 8.968
4857
+ - type: map_at_5
4858
+ value: 10.828
4859
+ - type: mrr_at_1
4860
+ value: 25.1
4861
+ - type: mrr_at_10
4862
+ value: 35.894999999999996
4863
+ - type: mrr_at_100
4864
+ value: 36.966
4865
+ - type: mrr_at_1000
4866
+ value: 37.019999999999996
4867
+ - type: mrr_at_3
4868
+ value: 32.467
4869
+ - type: mrr_at_5
4870
+ value: 34.416999999999994
4871
+ - type: ndcg_at_1
4872
+ value: 25.1
4873
+ - type: ndcg_at_10
4874
+ value: 21.096999999999998
4875
+ - type: ndcg_at_100
4876
+ value: 29.202
4877
+ - type: ndcg_at_1000
4878
+ value: 34.541
4879
+ - type: ndcg_at_3
4880
+ value: 19.875
4881
+ - type: ndcg_at_5
4882
+ value: 17.497
4883
+ - type: precision_at_1
4884
+ value: 25.1
4885
+ - type: precision_at_10
4886
+ value: 10.9
4887
+ - type: precision_at_100
4888
+ value: 2.255
4889
+ - type: precision_at_1000
4890
+ value: 0.35400000000000004
4891
+ - type: precision_at_3
4892
+ value: 18.367
4893
+ - type: precision_at_5
4894
+ value: 15.299999999999999
4895
+ - type: recall_at_1
4896
+ value: 5.093
4897
+ - type: recall_at_10
4898
+ value: 22.092
4899
+ - type: recall_at_100
4900
+ value: 45.778
4901
+ - type: recall_at_1000
4902
+ value: 71.985
4903
+ - type: recall_at_3
4904
+ value: 11.167
4905
+ - type: recall_at_5
4906
+ value: 15.501999999999999
4907
+ - task:
4908
+ type: STS
4909
+ dataset:
4910
+ type: mteb/sickr-sts
4911
+ name: MTEB SICK-R
4912
+ config: default
4913
+ split: test
4914
+ revision: 20a6d6f312dd54037fe07a32d58e5e168867909d
4915
+ metrics:
4916
+ - type: cos_sim_pearson
4917
+ value: 74.04386981759481
4918
+ - type: cos_sim_spearman
4919
+ value: 69.12484963763646
4920
+ - type: euclidean_pearson
4921
+ value: 71.49384353291062
4922
+ - type: euclidean_spearman
4923
+ value: 69.12484548317074
4924
+ - type: manhattan_pearson
4925
+ value: 71.49828173987272
4926
+ - type: manhattan_spearman
4927
+ value: 69.08350274367014
4928
+ - task:
4929
+ type: STS
4930
+ dataset:
4931
+ type: mteb/sts12-sts
4932
+ name: MTEB STS12
4933
+ config: default
4934
+ split: test
4935
+ revision: a0d554a64d88156834ff5ae9920b964011b16384
4936
+ metrics:
4937
+ - type: cos_sim_pearson
4938
+ value: 66.95372527615659
4939
+ - type: cos_sim_spearman
4940
+ value: 66.96821894433991
4941
+ - type: euclidean_pearson
4942
+ value: 64.675348002074
4943
+ - type: euclidean_spearman
4944
+ value: 66.96821894433991
4945
+ - type: manhattan_pearson
4946
+ value: 64.5965887073831
4947
+ - type: manhattan_spearman
4948
+ value: 66.88569076794741
4949
+ - task:
4950
+ type: STS
4951
+ dataset:
4952
+ type: mteb/sts13-sts
4953
+ name: MTEB STS13
4954
+ config: default
4955
+ split: test
4956
+ revision: 7e90230a92c190f1bf69ae9002b8cea547a64cca
4957
+ metrics:
4958
+ - type: cos_sim_pearson
4959
+ value: 77.34698437961983
4960
+ - type: cos_sim_spearman
4961
+ value: 79.1153001117325
4962
+ - type: euclidean_pearson
4963
+ value: 78.53562874696966
4964
+ - type: euclidean_spearman
4965
+ value: 79.11530018205724
4966
+ - type: manhattan_pearson
4967
+ value: 78.46484988944093
4968
+ - type: manhattan_spearman
4969
+ value: 79.01416027493104
4970
+ - task:
4971
+ type: STS
4972
+ dataset:
4973
+ type: mteb/sts14-sts
4974
+ name: MTEB STS14
4975
+ config: default
4976
+ split: test
4977
+ revision: 6031580fec1f6af667f0bd2da0a551cf4f0b2375
4978
+ metrics:
4979
+ - type: cos_sim_pearson
4980
+ value: 68.81220371935373
4981
+ - type: cos_sim_spearman
4982
+ value: 68.50538405089604
4983
+ - type: euclidean_pearson
4984
+ value: 68.69204272683749
4985
+ - type: euclidean_spearman
4986
+ value: 68.50534223912419
4987
+ - type: manhattan_pearson
4988
+ value: 68.67300120149523
4989
+ - type: manhattan_spearman
4990
+ value: 68.45404301623115
4991
+ - task:
4992
+ type: STS
4993
+ dataset:
4994
+ type: mteb/sts15-sts
4995
+ name: MTEB STS15
4996
+ config: default
4997
+ split: test
4998
+ revision: ae752c7c21bf194d8b67fd573edf7ae58183cbe3
4999
+ metrics:
5000
+ - type: cos_sim_pearson
5001
+ value: 78.2464678879813
5002
+ - type: cos_sim_spearman
5003
+ value: 79.92003940566667
5004
+ - type: euclidean_pearson
5005
+ value: 79.8080778793964
5006
+ - type: euclidean_spearman
5007
+ value: 79.92003940566667
5008
+ - type: manhattan_pearson
5009
+ value: 79.80153621444681
5010
+ - type: manhattan_spearman
5011
+ value: 79.91293261418134
5012
+ - task:
5013
+ type: STS
5014
+ dataset:
5015
+ type: mteb/sts16-sts
5016
+ name: MTEB STS16
5017
+ config: default
5018
+ split: test
5019
+ revision: 4d8694f8f0e0100860b497b999b3dbed754a0513
5020
+ metrics:
5021
+ - type: cos_sim_pearson
5022
+ value: 76.31179207708662
5023
+ - type: cos_sim_spearman
5024
+ value: 78.65597349856115
5025
+ - type: euclidean_pearson
5026
+ value: 78.76937027472678
5027
+ - type: euclidean_spearman
5028
+ value: 78.65597349856115
5029
+ - type: manhattan_pearson
5030
+ value: 78.77129513300605
5031
+ - type: manhattan_spearman
5032
+ value: 78.62640467680775
5033
+ - task:
5034
+ type: STS
5035
+ dataset:
5036
+ type: mteb/sts17-crosslingual-sts
5037
+ name: MTEB STS17 (en-en)
5038
+ config: en-en
5039
+ split: test
5040
+ revision: af5e6fb845001ecf41f4c1e033ce921939a2a68d
5041
+ metrics:
5042
+ - type: cos_sim_pearson
5043
+ value: 79.43158429552561
5044
+ - type: cos_sim_spearman
5045
+ value: 81.46108646565362
5046
+ - type: euclidean_pearson
5047
+ value: 81.47071791452292
5048
+ - type: euclidean_spearman
5049
+ value: 81.46108646565362
5050
+ - type: manhattan_pearson
5051
+ value: 81.56920643846031
5052
+ - type: manhattan_spearman
5053
+ value: 81.42226241399516
5054
+ - task:
5055
+ type: STS
5056
+ dataset:
5057
+ type: mteb/sts22-crosslingual-sts
5058
+ name: MTEB STS22 (en)
5059
+ config: en
5060
+ split: test
5061
+ revision: eea2b4fe26a775864c896887d910b76a8098ad3f
5062
+ metrics:
5063
+ - type: cos_sim_pearson
5064
+ value: 66.89546474141514
5065
+ - type: cos_sim_spearman
5066
+ value: 65.8393752170531
5067
+ - type: euclidean_pearson
5068
+ value: 67.2580522762307
5069
+ - type: euclidean_spearman
5070
+ value: 65.8393752170531
5071
+ - type: manhattan_pearson
5072
+ value: 67.45157729300522
5073
+ - type: manhattan_spearman
5074
+ value: 66.19470854403802
5075
+ - task:
5076
+ type: STS
5077
+ dataset:
5078
+ type: mteb/stsbenchmark-sts
5079
+ name: MTEB STSBenchmark
5080
+ config: default
5081
+ split: test
5082
+ revision: b0fddb56ed78048fa8b90373c8a3cfc37b684831
5083
+ metrics:
5084
+ - type: cos_sim_pearson
5085
+ value: 71.39566306334434
5086
+ - type: cos_sim_spearman
5087
+ value: 74.0981396086974
5088
+ - type: euclidean_pearson
5089
+ value: 73.7834496259745
5090
+ - type: euclidean_spearman
5091
+ value: 74.09803741302046
5092
+ - type: manhattan_pearson
5093
+ value: 73.79958138780945
5094
+ - type: manhattan_spearman
5095
+ value: 74.09894837555905
5096
+ - task:
5097
+ type: STS
5098
+ dataset:
5099
+ type: PhilipMay/stsb_multi_mt
5100
+ name: MTEB STSBenchmarkMultilingualSTS (en)
5101
+ config: en
5102
+ split: test
5103
+ revision: 93d57ef91790589e3ce9c365164337a8a78b7632
5104
+ metrics:
5105
+ - type: cos_sim_pearson
5106
+ value: 71.39566311006806
5107
+ - type: cos_sim_spearman
5108
+ value: 74.0981396086974
5109
+ - type: euclidean_pearson
5110
+ value: 73.78344970897099
5111
+ - type: euclidean_spearman
5112
+ value: 74.09803741302046
5113
+ - type: manhattan_pearson
5114
+ value: 73.79958147136705
5115
+ - type: manhattan_spearman
5116
+ value: 74.09894837555905
5117
+ - task:
5118
+ type: Reranking
5119
+ dataset:
5120
+ type: mteb/scidocs-reranking
5121
+ name: MTEB SciDocsRR
5122
+ config: default
5123
+ split: test
5124
+ revision: d3c5e1fc0b855ab6097bf1cda04dd73947d7caab
5125
+ metrics:
5126
+ - type: map
5127
+ value: 80.81059564334683
5128
+ - type: mrr
5129
+ value: 94.62696617108381
5130
+ - task:
5131
+ type: Retrieval
5132
+ dataset:
5133
+ type: mteb/scifact
5134
+ name: MTEB SciFact
5135
+ config: default
5136
+ split: test
5137
+ revision: 0228b52cf27578f30900b9e5271d331663a030d7
5138
+ metrics:
5139
+ - type: map_at_1
5140
+ value: 57.760999999999996
5141
+ - type: map_at_10
5142
+ value: 68.614
5143
+ - type: map_at_100
5144
+ value: 69.109
5145
+ - type: map_at_1000
5146
+ value: 69.134
5147
+ - type: map_at_3
5148
+ value: 65.735
5149
+ - type: map_at_5
5150
+ value: 67.42099999999999
5151
+ - type: mrr_at_1
5152
+ value: 60.667
5153
+ - type: mrr_at_10
5154
+ value: 69.94200000000001
5155
+ - type: mrr_at_100
5156
+ value: 70.254
5157
+ - type: mrr_at_1000
5158
+ value: 70.28
5159
+ - type: mrr_at_3
5160
+ value: 67.72200000000001
5161
+ - type: mrr_at_5
5162
+ value: 69.18900000000001
5163
+ - type: ndcg_at_1
5164
+ value: 60.667
5165
+ - type: ndcg_at_10
5166
+ value: 73.548
5167
+ - type: ndcg_at_100
5168
+ value: 75.381
5169
+ - type: ndcg_at_1000
5170
+ value: 75.991
5171
+ - type: ndcg_at_3
5172
+ value: 68.685
5173
+ - type: ndcg_at_5
5174
+ value: 71.26
5175
+ - type: precision_at_1
5176
+ value: 60.667
5177
+ - type: precision_at_10
5178
+ value: 9.833
5179
+ - type: precision_at_100
5180
+ value: 1.08
5181
+ - type: precision_at_1000
5182
+ value: 0.11299999999999999
5183
+ - type: precision_at_3
5184
+ value: 26.889000000000003
5185
+ - type: precision_at_5
5186
+ value: 17.8
5187
+ - type: recall_at_1
5188
+ value: 57.760999999999996
5189
+ - type: recall_at_10
5190
+ value: 87.13300000000001
5191
+ - type: recall_at_100
5192
+ value: 95
5193
+ - type: recall_at_1000
5194
+ value: 99.667
5195
+ - type: recall_at_3
5196
+ value: 74.211
5197
+ - type: recall_at_5
5198
+ value: 80.63900000000001
5199
+ - task:
5200
+ type: PairClassification
5201
+ dataset:
5202
+ type: mteb/sprintduplicatequestions-pairclassification
5203
+ name: MTEB SprintDuplicateQuestions
5204
+ config: default
5205
+ split: test
5206
+ revision: d66bd1f72af766a5cc4b0ca5e00c162f89e8cc46
5207
+ metrics:
5208
+ - type: cos_sim_accuracy
5209
+ value: 99.81881188118813
5210
+ - type: cos_sim_ap
5211
+ value: 95.21196473745837
5212
+ - type: cos_sim_f1
5213
+ value: 90.69767441860465
5214
+ - type: cos_sim_precision
5215
+ value: 91.71779141104295
5216
+ - type: cos_sim_recall
5217
+ value: 89.7
5218
+ - type: dot_accuracy
5219
+ value: 99.81881188118813
5220
+ - type: dot_ap
5221
+ value: 95.21196473745837
5222
+ - type: dot_f1
5223
+ value: 90.69767441860465
5224
+ - type: dot_precision
5225
+ value: 91.71779141104295
5226
+ - type: dot_recall
5227
+ value: 89.7
5228
+ - type: euclidean_accuracy
5229
+ value: 99.81881188118813
5230
+ - type: euclidean_ap
5231
+ value: 95.21196473745839
5232
+ - type: euclidean_f1
5233
+ value: 90.69767441860465
5234
+ - type: euclidean_precision
5235
+ value: 91.71779141104295
5236
+ - type: euclidean_recall
5237
+ value: 89.7
5238
+ - type: manhattan_accuracy
5239
+ value: 99.81287128712871
5240
+ - type: manhattan_ap
5241
+ value: 95.16667174835017
5242
+ - type: manhattan_f1
5243
+ value: 90.41095890410959
5244
+ - type: manhattan_precision
5245
+ value: 91.7610710607621
5246
+ - type: manhattan_recall
5247
+ value: 89.1
5248
+ - type: max_accuracy
5249
+ value: 99.81881188118813
5250
+ - type: max_ap
5251
+ value: 95.21196473745839
5252
+ - type: max_f1
5253
+ value: 90.69767441860465
5254
+ - task:
5255
+ type: Clustering
5256
+ dataset:
5257
+ type: mteb/stackexchange-clustering
5258
+ name: MTEB StackExchangeClustering
5259
+ config: default
5260
+ split: test
5261
+ revision: 6cbc1f7b2bc0622f2e39d2c77fa502909748c259
5262
+ metrics:
5263
+ - type: v_measure
5264
+ value: 59.54942204515638
5265
+ - task:
5266
+ type: Clustering
5267
+ dataset:
5268
+ type: mteb/stackexchange-clustering-p2p
5269
+ name: MTEB StackExchangeClusteringP2P
5270
+ config: default
5271
+ split: test
5272
+ revision: 815ca46b2622cec33ccafc3735d572c266efdb44
5273
+ metrics:
5274
+ - type: v_measure
5275
+ value: 39.42892282672948
5276
+ - task:
5277
+ type: Reranking
5278
+ dataset:
5279
+ type: mteb/stackoverflowdupquestions-reranking
5280
+ name: MTEB StackOverflowDupQuestions
5281
+ config: default
5282
+ split: test
5283
+ revision: e185fbe320c72810689fc5848eb6114e1ef5ec69
5284
+ metrics:
5285
+ - type: map
5286
+ value: 51.189033075914324
5287
+ - type: mrr
5288
+ value: 51.97014790764791
5289
+ - task:
5290
+ type: Summarization
5291
+ dataset:
5292
+ type: mteb/summeval
5293
+ name: MTEB SummEval
5294
+ config: default
5295
+ split: test
5296
+ revision: cda12ad7615edc362dbf25a00fdd61d3b1eaf93c
5297
+ metrics:
5298
+ - type: cos_sim_pearson
5299
+ value: 30.09466569775977
5300
+ - type: cos_sim_spearman
5301
+ value: 30.31058660775912
5302
+ - type: dot_pearson
5303
+ value: 30.09466438861689
5304
+ - type: dot_spearman
5305
+ value: 30.31058660775912
5306
+ - task:
5307
+ type: Retrieval
5308
+ dataset:
5309
+ type: mteb/trec-covid
5310
+ name: MTEB TRECCOVID
5311
+ config: default
5312
+ split: test
5313
+ revision: bb9466bac8153a0349341eb1b22e06409e78ef4e
5314
+ metrics:
5315
+ - type: map_at_1
5316
+ value: 0.253
5317
+ - type: map_at_10
5318
+ value: 2.07
5319
+ - type: map_at_100
5320
+ value: 12.679000000000002
5321
+ - type: map_at_1000
5322
+ value: 30.412
5323
+ - type: map_at_3
5324
+ value: 0.688
5325
+ - type: map_at_5
5326
+ value: 1.079
5327
+ - type: mrr_at_1
5328
+ value: 96
5329
+ - type: mrr_at_10
5330
+ value: 98
5331
+ - type: mrr_at_100
5332
+ value: 98
5333
+ - type: mrr_at_1000
5334
+ value: 98
5335
+ - type: mrr_at_3
5336
+ value: 98
5337
+ - type: mrr_at_5
5338
+ value: 98
5339
+ - type: ndcg_at_1
5340
+ value: 89
5341
+ - type: ndcg_at_10
5342
+ value: 79.646
5343
+ - type: ndcg_at_100
5344
+ value: 62.217999999999996
5345
+ - type: ndcg_at_1000
5346
+ value: 55.13400000000001
5347
+ - type: ndcg_at_3
5348
+ value: 83.458
5349
+ - type: ndcg_at_5
5350
+ value: 80.982
5351
+ - type: precision_at_1
5352
+ value: 96
5353
+ - type: precision_at_10
5354
+ value: 84.6
5355
+ - type: precision_at_100
5356
+ value: 64.34
5357
+ - type: precision_at_1000
5358
+ value: 24.534
5359
+ - type: precision_at_3
5360
+ value: 88.667
5361
+ - type: precision_at_5
5362
+ value: 85.6
5363
+ - type: recall_at_1
5364
+ value: 0.253
5365
+ - type: recall_at_10
5366
+ value: 2.253
5367
+ - type: recall_at_100
5368
+ value: 15.606
5369
+ - type: recall_at_1000
5370
+ value: 51.595
5371
+ - type: recall_at_3
5372
+ value: 0.7100000000000001
5373
+ - type: recall_at_5
5374
+ value: 1.139
5375
+ - task:
5376
+ type: Retrieval
5377
+ dataset:
5378
+ type: mteb/touche2020
5379
+ name: MTEB Touche2020
5380
+ config: default
5381
+ split: test
5382
+ revision: a34f9a33db75fa0cbb21bb5cfc3dae8dc8bec93f
5383
+ metrics:
5384
+ - type: map_at_1
5385
+ value: 3.0540000000000003
5386
+ - type: map_at_10
5387
+ value: 13.078999999999999
5388
+ - type: map_at_100
5389
+ value: 19.468
5390
+ - type: map_at_1000
5391
+ value: 21.006
5392
+ - type: map_at_3
5393
+ value: 6.8629999999999995
5394
+ - type: map_at_5
5395
+ value: 9.187
5396
+ - type: mrr_at_1
5397
+ value: 42.857
5398
+ - type: mrr_at_10
5399
+ value: 56.735
5400
+ - type: mrr_at_100
5401
+ value: 57.352000000000004
5402
+ - type: mrr_at_1000
5403
+ value: 57.352000000000004
5404
+ - type: mrr_at_3
5405
+ value: 52.721
5406
+ - type: mrr_at_5
5407
+ value: 54.66
5408
+ - type: ndcg_at_1
5409
+ value: 38.775999999999996
5410
+ - type: ndcg_at_10
5411
+ value: 31.469
5412
+ - type: ndcg_at_100
5413
+ value: 42.016999999999996
5414
+ - type: ndcg_at_1000
5415
+ value: 52.60399999999999
5416
+ - type: ndcg_at_3
5417
+ value: 35.894
5418
+ - type: ndcg_at_5
5419
+ value: 33.873
5420
+ - type: precision_at_1
5421
+ value: 42.857
5422
+ - type: precision_at_10
5423
+ value: 27.346999999999998
5424
+ - type: precision_at_100
5425
+ value: 8.327
5426
+ - type: precision_at_1000
5427
+ value: 1.551
5428
+ - type: precision_at_3
5429
+ value: 36.735
5430
+ - type: precision_at_5
5431
+ value: 33.469
5432
+ - type: recall_at_1
5433
+ value: 3.0540000000000003
5434
+ - type: recall_at_10
5435
+ value: 19.185
5436
+ - type: recall_at_100
5437
+ value: 51.056000000000004
5438
+ - type: recall_at_1000
5439
+ value: 82.814
5440
+ - type: recall_at_3
5441
+ value: 7.961
5442
+ - type: recall_at_5
5443
+ value: 11.829
5444
+ - task:
5445
+ type: Classification
5446
+ dataset:
5447
+ type: mteb/toxic_conversations_50k
5448
+ name: MTEB ToxicConversationsClassification
5449
+ config: default
5450
+ split: test
5451
+ revision: edfaf9da55d3dd50d43143d90c1ac476895ae6de
5452
+ metrics:
5453
+ - type: accuracy
5454
+ value: 64.9346
5455
+ - type: ap
5456
+ value: 12.121605736777527
5457
+ - type: f1
5458
+ value: 50.169902005887955
5459
+ - task:
5460
+ type: Classification
5461
+ dataset:
5462
+ type: mteb/tweet_sentiment_extraction
5463
+ name: MTEB TweetSentimentExtractionClassification
5464
+ config: default
5465
+ split: test
5466
+ revision: d604517c81ca91fe16a244d1248fc021f9ecee7a
5467
+ metrics:
5468
+ - type: accuracy
5469
+ value: 56.72608941709111
5470
+ - type: f1
5471
+ value: 57.0702928875253
5472
+ - task:
5473
+ type: Clustering
5474
+ dataset:
5475
+ type: mteb/twentynewsgroups-clustering
5476
+ name: MTEB TwentyNewsgroupsClustering
5477
+ config: default
5478
+ split: test
5479
+ revision: 6125ec4e24fa026cec8a478383ee943acfbd5449
5480
+ metrics:
5481
+ - type: v_measure
5482
+ value: 37.72671554400943
5483
+ - task:
5484
+ type: PairClassification
5485
+ dataset:
5486
+ type: mteb/twittersemeval2015-pairclassification
5487
+ name: MTEB TwitterSemEval2015
5488
+ config: default
5489
+ split: test
5490
+ revision: 70970daeab8776df92f5ea462b6173c0b46fd2d1
5491
+ metrics:
5492
+ - type: cos_sim_accuracy
5493
+ value: 82.84556237706384
5494
+ - type: cos_sim_ap
5495
+ value: 63.28364215788651
5496
+ - type: cos_sim_f1
5497
+ value: 60.00000000000001
5498
+ - type: cos_sim_precision
5499
+ value: 54.45161290322581
5500
+ - type: cos_sim_recall
5501
+ value: 66.80738786279683
5502
+ - type: dot_accuracy
5503
+ value: 82.84556237706384
5504
+ - type: dot_ap
5505
+ value: 63.28364302860433
5506
+ - type: dot_f1
5507
+ value: 60.00000000000001
5508
+ - type: dot_precision
5509
+ value: 54.45161290322581
5510
+ - type: dot_recall
5511
+ value: 66.80738786279683
5512
+ - type: euclidean_accuracy
5513
+ value: 82.84556237706384
5514
+ - type: euclidean_ap
5515
+ value: 63.28363625097978
5516
+ - type: euclidean_f1
5517
+ value: 60.00000000000001
5518
+ - type: euclidean_precision
5519
+ value: 54.45161290322581
5520
+ - type: euclidean_recall
5521
+ value: 66.80738786279683
5522
+ - type: manhattan_accuracy
5523
+ value: 82.86940454193241
5524
+ - type: manhattan_ap
5525
+ value: 63.244773709836764
5526
+ - type: manhattan_f1
5527
+ value: 60.12680942696495
5528
+ - type: manhattan_precision
5529
+ value: 55.00109433136353
5530
+ - type: manhattan_recall
5531
+ value: 66.3060686015831
5532
+ - type: max_accuracy
5533
+ value: 82.86940454193241
5534
+ - type: max_ap
5535
+ value: 63.28364302860433
5536
+ - type: max_f1
5537
+ value: 60.12680942696495
5538
+ - task:
5539
+ type: PairClassification
5540
+ dataset:
5541
+ type: mteb/twitterurlcorpus-pairclassification
5542
+ name: MTEB TwitterURLCorpus
5543
+ config: default
5544
+ split: test
5545
+ revision: 8b6510b0b1fa4e4c4f879467980e9be563ec1cdf
5546
+ metrics:
5547
+ - type: cos_sim_accuracy
5548
+ value: 88.32033220786278
5549
+ - type: cos_sim_ap
5550
+ value: 84.71928176006863
5551
+ - type: cos_sim_f1
5552
+ value: 76.51483333969684
5553
+ - type: cos_sim_precision
5554
+ value: 75.89184276300841
5555
+ - type: cos_sim_recall
5556
+ value: 77.14813674160764
5557
+ - type: dot_accuracy
5558
+ value: 88.32033220786278
5559
+ - type: dot_ap
5560
+ value: 84.71928330149228
5561
+ - type: dot_f1
5562
+ value: 76.51483333969684
5563
+ - type: dot_precision
5564
+ value: 75.89184276300841
5565
+ - type: dot_recall
5566
+ value: 77.14813674160764
5567
+ - type: euclidean_accuracy
5568
+ value: 88.32033220786278
5569
+ - type: euclidean_ap
5570
+ value: 84.71928045384345
5571
+ - type: euclidean_f1
5572
+ value: 76.51483333969684
5573
+ - type: euclidean_precision
5574
+ value: 75.89184276300841
5575
+ - type: euclidean_recall
5576
+ value: 77.14813674160764
5577
+ - type: manhattan_accuracy
5578
+ value: 88.27570147863545
5579
+ - type: manhattan_ap
5580
+ value: 84.68523541579755
5581
+ - type: manhattan_f1
5582
+ value: 76.51512269355146
5583
+ - type: manhattan_precision
5584
+ value: 75.62608107091825
5585
+ - type: manhattan_recall
5586
+ value: 77.42531567600862
5587
+ - type: max_accuracy
5588
+ value: 88.32033220786278
5589
+ - type: max_ap
5590
+ value: 84.71928330149228
5591
+ - type: max_f1
5592
+ value: 76.51512269355146
5593
+ - task:
5594
+ type: Clustering
5595
+ dataset:
5596
+ type: jinaai/cities_wiki_clustering
5597
+ name: MTEB WikiCitiesClustering
5598
+ config: default
5599
+ split: test
5600
+ revision: ddc9ee9242fa65332597f70e967ecc38b9d734fa
5601
+ metrics:
5602
+ - type: v_measure
5603
+ value: 85.30624598674467
5604
+ license: apache-2.0
5605
+ ---
5606
5607
+ <h1 align="center">Snowflake's Arctic-embed-s</h1>
5608
+ <h4 align="center">
5609
+ <p>
5610
+ <a href="#news">News</a> |
5611
+ <a href="#models">Models</a> |
5612
+ <a href="#usage">Usage</a> |
5613
+ <a href="#evaluation">Evaluation</a> |
5614
+ <a href="#contact">Contact</a> |
5615
+ <a href="#faq">FAQ</a> |
5616
+ <a href="#license">License</a> |
5617
+ <a href="#acknowledgement">Acknowledgement</a>
5618
+ </p>
5619
+ </h4>
5620
+
5621
+
5622
+ ## News
5623
+
5624
+
5625
+ 04/16/2024: Released the **Arctic-embed** family of text embedding models. These releases are state-of-the-art for retrieval quality at each of their representative size profiles. A technical report is coming shortly. For more details, please refer to our GitHub repository: [Arctic-Text-Embed](https://github.com/Snowflake/Arctic-Text-Embed).
5626
+
5627
+
5628
+ ## Models
5629
+
5630
+
5631
+ Arctic-embed is a suite of text embedding models focused on high-quality retrieval performance.
5632
+
5633
+
5634
+ The `arctic-embed` models achieve **state-of-the-art performance on the MTEB/BEIR leaderboard** for each of their size variants. Evaluation is performed using these [scripts](https://github.com/Snowflake-Labs/arctic-embed/tree/main/src). As shown below, each model size class achieves SOTA retrieval accuracy compared to other top models.
5635
+
5636
+
5637
+ The models are trained by leveraging existing open-source text representation models, such as bert-base-uncased, in a multi-stage pipeline that optimizes their retrieval performance. First, the models are pretrained with large batches of query-document pairs in which negatives are derived in-batch; this pretraining leverages about 400m samples drawn from a mix of public datasets and proprietary web search data. Following pretraining, the models are further optimized with long training on a smaller dataset (about 1m samples) of query, positive document, and negative document triplets derived from hard negative mining. Mining of the negatives and data curation are crucial to retrieval accuracy. A detailed technical report will be available shortly.
5638
+
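+ For illustration only, the snippet below is a minimal sketch of the in-batch-negative contrastive objective described above, written in plain PyTorch. The loss form, temperature, batch size, and embedding dimension are assumptions chosen for the example rather than details of our training recipe; the second-stage triplet training follows the same pattern but supplies explicitly mined hard negatives.
+
+ ```python
+ import torch
+ import torch.nn.functional as F
+
+ def in_batch_contrastive_loss(query_emb, doc_emb, temperature=0.02):
+     """InfoNCE-style loss: for each query, every other document in the batch acts as a negative."""
+     query_emb = F.normalize(query_emb, p=2, dim=1)  # L2-normalize so dot product equals cosine similarity
+     doc_emb = F.normalize(doc_emb, p=2, dim=1)
+     logits = query_emb @ doc_emb.T / temperature    # (batch, batch) similarity matrix
+     labels = torch.arange(logits.size(0), device=logits.device)  # diagonal entries are the positives
+     return F.cross_entropy(logits, labels)
+
+ # Toy usage: random vectors stand in for encoder outputs of a query/document batch (values are made up).
+ query_emb = torch.randn(8, 384, requires_grad=True)
+ doc_emb = torch.randn(8, 384, requires_grad=True)
+ loss = in_batch_contrastive_loss(query_emb, doc_emb)
+ loss.backward()  # in a real pipeline this gradient would update the shared encoder
+ ```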
5639
+
5640
+ | Name | MTEB Retrieval Score (NDCG @ 10) | Parameters (Millions) | Embedding Dimension |
5641
+ | ----------------------------------------------------------------------- | -------------------------------- | --------------------- | ------------------- |
5642
+ | [arctic-embed-xs](https://huggingface.co/Snowflake/arctic-embed-xs/)    | 50.15                            | 22                    | 384                 |
5643
+ | [arctic-embed-s](https://huggingface.co/Snowflake/arctic-embed-s/) | 51.98 | 33 | 384 |
5644
+ | [arctic-embed-m](https://huggingface.co/Snowflake/arctic-embed-m/)      | 54.90                            | 110                   | 768                 |
5645
+ | [arctic-embed-m-long](https://huggingface.co/Snowflake/arctic-embed-m-long/) | 54.83                       | 137                   | 768                 |
5646
+ | [arctic-embed-l](https://huggingface.co/Snowflake/arctic-embed-l/)      | 55.98                            | 335                   | 1024                |
5647
+
5648
+
5649
+ Aside from being great open-source models, the largest model, [arctic-embed-l](https://huggingface.co/Snowflake/arctic-embed-l/), can serve as a natural replacement for closed-source embedding APIs, as shown below.
5650
+
5651
+
5652
+ | Model Name | MTEB Retrieval Score (NDCG @ 10) |
5653
+ | ------------------------------------------------------------------ | -------------------------------- |
5654
+ | [arctic-embed-l](https://huggingface.co/Snowflake/arctic-embed-l/) | 55.98                            |
5655
+ | Google-gecko-text-embedding | 55.7 |
5656
+ | text-embedding-3-large | 55.44 |
5657
+ | Cohere-embed-english-v3.0 | 55.00 |
5658
+ | bge-large-en-v1.5 | 54.29 |
5659
+
5660
+
5661
+ ### [arctic-embed-xs](https://huggingface.co/Snowflake/arctic-embed-xs/)
5662
+
5663
+
5664
+ Based on the [all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) model, this tiny model packs quite the punch. With only 22m parameters and 384 dimensions, it should meet even the strictest latency/TCO budgets. Despite its size, its retrieval accuracy approaches that of models with 100m parameters.
5665
+
5666
+
5667
+ | Model Name | MTEB Retrieval Score (NDCG @ 10) |
5668
+ | ------------------------------------------------------------------- | -------------------------------- |
5669
+ | [arctic-embed-xs](https://huggingface.co/Snowflake/arctic-embed-xs/) | 50.15                            |
5670
+ | GIST-all-MiniLM-L6-v2 | 45.12 |
5671
+ | gte-tiny | 44.92 |
5672
+ | all-MiniLM-L6-v2 | 41.95 |
5673
+ | bge-micro-v2 | 42.56 |
5674
+
5675
+
5676
+ ### [arctic-embed-s](https://huggingface.co/Snowflake/arctic-embed-s/)
5677
+
5678
+
5679
+ Based on the [all-MiniLM-L12-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L12-v2) model, this small model does not trade off retrieval accuracy for its small size. With only 33m parameters and 384 dimensions, it should easily allow scaling to large datasets.
5680
+
5681
+
5682
+ | Model Name | MTEB Retrieval Score (NDCG @ 10) |
5683
+ | ------------------------------------------------------------------ | -------------------------------- |
5684
+ | [arctic-embed-s](https://huggingface.co/Snowflake/arctic-embed-s/) | 51.98 |
5685
+ | bge-small-en-v1.5 | 51.68 |
5686
+ | Cohere-embed-english-light-v3.0 | 51.34 |
5687
+ | text-embedding-3-small | 51.08 |
5688
+ | e5-small-v2 | 49.04 |
5689
+
5690
+
5691
+ ### [arctic-embed-m-long](https://huggingface.co/Snowflake/arctic-embed-m-long/)
5692
+
5693
+
5694
+ Based on the [nomic-embed-text-v1](https://huggingface.co/nomic-ai/nomic-embed-text-v1) model, this long-context variant of our medium-sized model is well suited to workloads that would be constrained by the regular 512-token context of our other models. Without the use of RPE, this model supports up to 2048 tokens. With RPE, it can scale to 8192!
5695
+
5696
+
5697
+ | Model Name | MTEB Retrieval Score (NDCG @ 10) |
5698
+ | ------------------------------------------------------------------ | -------------------------------- |
5699
+ | [arctic-embed-m-long](https://huggingface.co/Snowflake/arctic-embed-m-long/) | 54.83                      |
5700
+ | bge-base-en-v1.5 | 53.25 |
5701
+ | nomic-embed-text-v1.5 | 53.01 |
5702
+ | GIST-Embedding-v0 | 52.31 |
5703
+ | gte-base | 52.31 |
5704
+
5705
+
5706
+ ### [arctic-embed-m](https://huggingface.co/Snowflake/arctic-embed-m/)
5707
+
5708
+
5709
+ Based on the [intfloat/e5-base-unsupervised](https://huggingface.co/intfloat/e5-base-unsupervised) model, this medium model is the workhorse, providing strong retrieval performance without slowing down inference.
5710
+
5711
+
5712
+ | Model Name | MTEB Retrieval Score (NDCG @ 10) |
5713
+ | ------------------------------------------------------------------ | -------------------------------- |
5714
+ | [arctic-embed-m](https://huggingface.co/Snowflake/arctic-embed-m/)  | 54.90                            |
5715
+ | bge-base-en-v1.5 | 53.25 |
5716
+ | nomic-embed-text-v1.5                                               | 53.01                            |
5717
+ | GIST-Embedding-v0 | 52.31 |
5718
+ | gte-base | 52.31 |
5719
+
5720
+
5721
+ ### [arctic-embed-l](https://huggingface.co/Snowflake/arctic-embed-l/)
5722
+
5723
+
5724
+ Based on the [intfloat/e5-large-unsupervised](https://huggingface.co/intfloat/e5-large-unsupervised) model, this large model delivers the strongest retrieval accuracy in the Arctic-embed family and serves as a natural replacement for closed-source embedding APIs.
5725
+
5726
+
5727
+ | Model Name | MTEB Retrieval Score (NDCG @ 10) |
5728
+ | ------------------------------------------------------------------ | -------------------------------- |
5729
+ | [arctic-embed-l](https://huggingface.co/Snowflake/arctic-embed-l/)  | 55.98                            |
5730
+ | UAE-Large-V1 | 54.66 |
5731
+ | bge-large-en-v1.5 | 54.29 |
5732
+ | mxbai-embed-large-v1 | 54.39 |
5733
+ | e5-Large-v2 | 50.56 |
5734
+
5735
+
5736
+ ## Usage
5737
+
5738
+
5739
+ ### Using Hugging Face Transformers
5740
+
5741
+
5742
+ You can use the `transformers` package with an arctic-embed model, as shown below. For optimal retrieval quality, use the CLS token to embed each text portion, and prepend the query prefix below to queries only (not to documents).
5743
+
5744
+
5745
+
5746
+ ```python
5747
+ import torch
+ from transformers import AutoModel, AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained('Snowflake/arctic-embed-s')
+ model = AutoModel.from_pretrained('Snowflake/arctic-embed-s', add_pooling_layer=False)
+ model.eval()
+
+ query_prefix = 'Represent this sentence for searching relevant passages: '
+ queries = ['what is snowflake?', 'Where can I get the best tacos?']
+ queries_with_prefix = ["{}{}".format(query_prefix, i) for i in queries]
+ query_tokens = tokenizer(queries_with_prefix, padding=True, truncation=True, return_tensors='pt', max_length=512)
+
+ documents = ['The Data Cloud!', 'Mexico City of Course!']
+ document_tokens = tokenizer(documents, padding=True, truncation=True, return_tensors='pt', max_length=512)
+
+ # Compute token embeddings (the CLS token is used as the text embedding)
+ with torch.no_grad():
+     query_embeddings = model(**query_tokens)[0][:, 0]
+     document_embeddings = model(**document_tokens)[0][:, 0]
+
+ # Normalize embeddings so the dot product below equals cosine similarity
+ query_embeddings = torch.nn.functional.normalize(query_embeddings, p=2, dim=1)
+ document_embeddings = torch.nn.functional.normalize(document_embeddings, p=2, dim=1)
+
+ # Score every document against every query and print ranked results
+ scores = torch.mm(query_embeddings, document_embeddings.transpose(0, 1))
+ for query, query_scores in zip(queries, scores):
+     doc_score_pairs = list(zip(documents, query_scores))
+     doc_score_pairs = sorted(doc_score_pairs, key=lambda x: x[1], reverse=True)
+     # Output passages & scores
+     print("Query:", query)
+     for document, score in doc_score_pairs:
+         print(score, document)
5780
+ ```
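+
+ If you prefer the `sentence-transformers` library, the snippet below is a minimal sketch of the same workflow. It assumes the repository ships a sentence-transformers configuration that reproduces the CLS pooling used above; if it does not, treat the `transformers` example as the reference path.
+
+ ```python
+ from sentence_transformers import SentenceTransformer, util
+
+ # Assumes the repo provides a sentence-transformers config; pooling may differ otherwise.
+ model = SentenceTransformer('Snowflake/arctic-embed-s')
+
+ query_prefix = 'Represent this sentence for searching relevant passages: '
+ queries = ['what is snowflake?', 'Where can I get the best tacos?']
+ documents = ['The Data Cloud!', 'Mexico City of Course!']
+
+ # Prepend the prefix to queries only, exactly as in the transformers example.
+ query_embeddings = model.encode([query_prefix + q for q in queries], normalize_embeddings=True)
+ document_embeddings = model.encode(documents, normalize_embeddings=True)
+
+ # Cosine similarity between every query and every document.
+ scores = util.cos_sim(query_embeddings, document_embeddings)
+ for query, query_scores in zip(queries, scores):
+     doc_score_pairs = sorted(zip(documents, query_scores.tolist()), key=lambda x: x[1], reverse=True)
+     print("Query:", query)
+     for document, score in doc_score_pairs:
+         print(score, document)
+ ```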
5781
+
5782
+
5783
+ If you use the long-context model with more than 2048 tokens, initialize the model as shown below instead. This uses [RPE](https://arxiv.org/abs/2104.09864) scaling to support up to 8192 tokens.
5784
+
5785
+
5786
+ ```python
5787
+ model = AutoModel.from_pretrained('Snowflake/arctic-embed-m-long', trust_remote_code=True, rotary_scaling_factor=2)
5788
+ ```
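+
+ As a quick sanity check, the hypothetical snippet below (the document text and token counts are made up for the example) shows how you would tokenize longer inputs once the model has been loaded with `rotary_scaling_factor=2`: simply raise `max_length` beyond the default 512.
+
+ ```python
+ from transformers import AutoTokenizer
+
+ tokenizer = AutoTokenizer.from_pretrained('Snowflake/arctic-embed-m-long')
+ long_document = " ".join(["snowflake"] * 5000)  # toy text, well past 2048 tokens
+
+ # Truncate to the extended 8192-token window instead of the usual 512.
+ long_tokens = tokenizer(long_document, padding=True, truncation=True, return_tensors='pt', max_length=8192)
+ print(long_tokens['input_ids'].shape)
+ ```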
5789
+
5790
+
5791
+ ## FAQ
5792
+
5793
+
5794
+ TBD
5795
+
5796
+
5797
+ ## Contact
5798
+
5799
+
5800
+ Feel free to open an issue or pull request if you have any questions or suggestions about this project.
5801
+ You can also email Daniel Campos (daniel.campos@snowflake.com).
5802
+
5803
+
5804
+ ## License
5805
+
5806
+
5807
+ Arctic-embed is licensed under the [Apache-2.0 license](https://www.apache.org/licenses/LICENSE-2.0). The released models can be used for commercial purposes free of charge.
5808
+
5809
+
5810
+ ## Acknowledgement
5811
+
5812
+
5813
+ We want to thank the open-source community for providing the great building blocks upon which we could build our models and for making these releases possible.
+ We thank our modeling engineers, Danmei Xu, Luke Merrick, Gaurav Nuti, and Daniel Campos, for making these great models possible.
+ We thank our leadership, Himabindu Pucha, Kelvin So, Vivek Raghunathan, and Sridhar Ramaswamy, for supporting this work.
+ Finally, we thank the researchers who created the BEIR and MTEB benchmarks; it is largely thanks to their tireless work defining what better looks like that we could improve model performance.