# Dataset Card for "bigcodebench-hard-test-elo"

Per-model Elo ratings on the BigCodeBench-Hard test set, one row per model, with interval bounds and pre-computed error bars. Ratings range from roughly 642 to 1,328. Columns:

- `model` (string): model name, with a date or version suffix where needed to disambiguate releases.
- `lower` (float64): lower bound of the rating's confidence interval.
- `rating` (float64): the model's Elo rating.
- `upper` (float64): upper bound of the rating's confidence interval.
- `error_y` (float64): upper error bar, i.e. `upper - rating`.
- `error_y_minus` (float64): lower error bar, i.e. `rating - lower`.
- `rating_rounded` (float64): `rating` rounded to two decimal places.

The error-bar columns use Plotly's argument names, so the table can be plotted directly; see the sketch below.
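`error_y` and `error_y_minus` match the parameters Plotly Express uses for asymmetric error bars, so a leaderboard chart needs no extra arithmetic. A minimal plotting sketch, assuming the table has been exported to a local `ratings.csv` (a hypothetical file name, not part of this dataset):

```python
# Plotting sketch: leaderboard with asymmetric error bars.
# "ratings.csv" is a hypothetical local export of the table below;
# thousands="," handles the comma-separated values as displayed here.
import pandas as pd
import plotly.express as px

df = pd.read_csv("ratings.csv", thousands=",")

fig = px.scatter(
    df,
    x="model",
    y="rating",
    error_y="error_y",              # upper bar: upper - rating
    error_y_minus="error_y_minus",  # lower bar: rating - lower
)
fig.update_xaxes(tickangle=45)
fig.show()
```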
The full leaderboard, sorted by `rating` in descending order:

| model | lower | rating | upper | error_y | error_y_minus | rating_rounded |
|---|---:|---:|---:|---:|---:|---:|
| DeepSeek-Coder-V2-Instruct (2024-07-24) | 1,317.709562 | 1,328.380333 | 1,339.631477 | 11.251143 | 10.670771 | 1,328.38 |
| Llama-3.1-405B-Instruct | 1,255.535813 | 1,265.956492 | 1,275.984701 | 10.028209 | 10.420679 | 1,265.96 |
| Mistral-Large-Instruct-2407 | 1,245.452689 | 1,254.762936 | 1,264.422743 | 9.659808 | 9.310246 | 1,254.76 |
| Llama-3.1-70B-Instruct | 1,231.281515 | 1,241.465538 | 1,250.903118 | 9.43758 | 10.184022 | 1,241.47 |
| DeepSeek-V2-Chat (2024-06-28) | 1,210.267059 | 1,218.433092 | 1,226.763568 | 8.330476 | 8.166033 | 1,218.43 |
| GPT-4o-mini-2024-07-18 | 1,196.219913 | 1,204.637129 | 1,213.833 | 9.195871 | 8.417216 | 1,204.64 |
| Llama-3-70B-Synthia-v3.5 | 1,181.03523 | 1,189.083359 | 1,197.71602 | 8.632662 | 8.048128 | 1,189.08 |
| Claude-3.5-Sonnet-20240620 | 1,158.980732 | 1,166.99857 | 1,174.8275 | 7.828931 | 8.017838 | 1,167 |
| DeepSeek-Coder-V2-Instruct | 1,130.418591 | 1,139.079719 | 1,147.49014 | 8.410421 | 8.661128 | 1,139.08 |
| WhiteRabbitNeo-33B-v1.5 | 1,126.122928 | 1,135.335503 | 1,144.665767 | 9.330265 | 9.212575 | 1,135.34 |
| Athene-70B | 1,123.020226 | 1,131.945045 | 1,141.290825 | 9.345779 | 8.924819 | 1,131.95 |
| Hermes-2-Pro-Llama-3-70B | 1,119.122732 | 1,127.283072 | 1,135.610233 | 8.327161 | 8.16034 | 1,127.28 |
| Phi-3-Mini-128K-Instruct (June 2024) | 1,115.986868 | 1,123.394147 | 1,131.868269 | 8.474122 | 7.407279 | 1,123.39 |
| Gemma-2-27B-Instruct | 1,108.624783 | 1,116.971724 | 1,125.964458 | 8.992734 | 8.346941 | 1,116.97 |
| Hermes-2-Theta-Llama-3-70B | 1,096.47112 | 1,105.164721 | 1,114.033204 | 8.868483 | 8.693601 | 1,105.16 |
| Gemini-1.5-Pro-API-0514 | 1,088.221162 | 1,095.483648 | 1,104.054786 | 8.571137 | 7.262487 | 1,095.48 |
| Nxcode-CQ-7B-Orpo | 1,085.42812 | 1,093.943875 | 1,103.507258 | 9.563383 | 8.515755 | 1,093.94 |
| Tess-v2.5.2-Qwen2-72B | 1,081.064012 | 1,090.31957 | 1,100.048494 | 9.728924 | 9.255558 | 1,090.32 |
| Yi-Large | 1,081.595833 | 1,090.202791 | 1,099.189844 | 8.987053 | 8.606958 | 1,090.2 |
| CodeGeex4-All-9B | 1,077.499046 | 1,087.129825 | 1,095.786502 | 8.656677 | 9.630778 | 1,087.13 |
| Claude-3-Opus-20240229 | 1,079.036054 | 1,086.845606 | 1,094.84143 | 7.995824 | 7.809552 | 1,086.85 |
| Gemini-1.5-Flash-API-0514 | 1,077.746887 | 1,086.02523 | 1,094.35593 | 8.3307 | 8.278343 | 1,086.03 |
| GPT-4-Turbo-2024-04-09 | 1,069.45717 | 1,078.346746 | 1,085.843134 | 7.496388 | 8.889575 | 1,078.35 |
| ReflectionCoder-DS-33B | 1,069.520915 | 1,077.849433 | 1,085.634621 | 7.785187 | 8.328519 | 1,077.85 |
| Codestral-22B-v0.1 | 1,053.230425 | 1,061.599669 | 1,070.502404 | 8.902735 | 8.369245 | 1,061.6 |
| Claude-3-Sonnet-20240229 | 1,048.522214 | 1,057.67902 | 1,066.113977 | 8.434957 | 9.156806 | 1,057.68 |
| Llama-3-70B-Instruct | 1,046.996305 | 1,055.258439 | 1,063.77949 | 8.52105 | 8.262134 | 1,055.26 |
| Llama-3.1-8B-Instruct | 1,033.149973 | 1,042.610182 | 1,051.777137 | 9.166955 | 9.460209 | 1,042.61 |
| AutoCoder | 1,034.033742 | 1,042.141684 | 1,051.125018 | 8.983334 | 8.107942 | 1,042.14 |
| Qwen2-72B-Chat | 1,033.243485 | 1,041.561211 | 1,050.036414 | 8.475203 | 8.317726 | 1,041.56 |
| Artigenz-Coder-DS-6.7B | 1,026.595081 | 1,035.82544 | 1,045.014684 | 9.189244 | 9.230359 | 1,035.83 |
| GPT-4o-2024-05-13 | 1,025.526537 | 1,033.09784 | 1,040.944852 | 7.847012 | 7.571303 | 1,033.1 |
| Phind-CodeLlama-34B-v2 | 1,023.195062 | 1,032.534509 | 1,042.053012 | 9.518503 | 9.339447 | 1,032.53 |
| DeepSeek-Coder-V2-Lite-Instruct | 1,023.138189 | 1,031.680203 | 1,040.089027 | 8.408824 | 8.542014 | 1,031.68 |
| ReflectionCoder-DS-6.7B | 1,019.914974 | 1,029.046562 | 1,038.359967 | 9.313405 | 9.131588 | 1,029.05 |
| WaveCoder-Ultra-6.7B | 1,004.538213 | 1,013.661743 | 1,022.899721 | 9.237978 | 9.123529 | 1,013.66 |
| Claude-3-Haiku-20240307 | 995.563236 | 1,004.295433 | 1,013.125799 | 8.830366 | 8.732198 | 1,004.3 |
| DeepSeek-Coder-33B-Instruct | 992.896253 | 1,001.545834 | 1,010.931136 | 9.385302 | 8.649581 | 1,001.55 |
| ReflectionCoder-CL-34B | 989.849804 | 998.639387 | 1,007.571972 | 8.932584 | 8.789584 | 998.64 |
| Mixtral-8x22B-Instruct | 987.721955 | 995.887453 | 1,003.418741 | 7.531288 | 8.165498 | 995.89 |
| OpenCodeInterpreter-DS-6.7B | 980.953282 | 990.070362 | 999.234009 | 9.163646 | 9.11708 | 990.07 |
| DeepSeek-V2-Chat | 980.277394 | 988.555506 | 997.506231 | 8.950725 | 8.278112 | 988.56 |
| OpenChat-3.6-8B-20240522 | 978.449625 | 987.997821 | 997.695739 | 9.697918 | 9.548196 | 988 |
| Command R+ | 978.599757 | 987.55993 | 996.419066 | 8.859136 | 8.960173 | 987.56 |
| Codestral-Mamba | 970.109455 | 979.238518 | 989.070213 | 9.831694 | 9.129063 | 979.24 |
| CodeQwen1.5-7B-Chat | 961.651604 | 971.178937 | 980.132792 | 8.953855 | 9.527333 | 971.18 |
| Granite-Code-34B-Instruct | 958.696066 | 966.975394 | 975.958429 | 8.983036 | 8.279327 | 966.98 |
| GPT-4-0613 | 955.97847 | 965.142804 | 974.22678 | 9.083976 | 9.164334 | 965.14 |
| Phi-3-Medium-128K-Instruct | 955.073531 | 963.792943 | 974.366979 | 10.574036 | 8.719412 | 963.79 |
| GPT-3.5-Turbo-0125 | 951.36168 | 960.771642 | 969.167405 | 8.395764 | 9.409962 | 960.77 |
| Phi-3-Mini-128K-Instruct (Old) | 948.383953 | 958.680247 | 968.580687 | 9.900441 | 10.296293 | 958.68 |
| Qwen1.5-110B-Chat | 941.495078 | 952.775466 | 961.79412 | 9.018654 | 11.280388 | 952.78 |
| Phi-3-Small-128K-Instruct | 938.677698 | 948.44834 | 957.116627 | 8.668287 | 9.770642 | 948.45 |
| AutoCoder-QW-7B | 938.299685 | 947.64176 | 957.325245 | 9.683485 | 9.342075 | 947.64 |
| AutoCoder-S-6.7B | 929.667406 | 938.757613 | 948.601211 | 9.843597 | 9.090207 | 938.76 |
| Qwen2-57B-A14B | 927.202158 | 936.390689 | 945.748418 | 9.357728 | 9.188532 | 936.39 |
| Gemma-2-9B-Instruct | 922.088 | 931.202727 | 940.261237 | 9.05851 | 9.114727 | 931.2 |
| CodeLlama-70B-Instruct | 922.228747 | 931.118777 | 941.353054 | 10.234277 | 8.890029 | 931.12 |
| Mistral-Small-2402 | 920.03059 | 929.06857 | 938.710924 | 9.642353 | 9.03798 | 929.07 |
| ReflectionCoder-CL-7B | 918.032275 | 928.453378 | 938.539881 | 10.086503 | 10.421104 | 928.45 |
| Qwen2-7B-Instruct | 918.169213 | 928.387203 | 937.797492 | 9.41029 | 10.217989 | 928.39 |
| DeepSeek-Coder-6.7B-Instruct | 908.881485 | 919.49104 | 930.402065 | 10.911025 | 10.609555 | 919.49 |
| Yi-1.5-9B-Chat | 910.206009 | 919.126532 | 929.423052 | 10.29652 | 8.920523 | 919.13 |
| Qwen1.5-72B-Chat | 899.568038 | 909.3352 | 918.386754 | 9.051554 | 9.767162 | 909.34 |
| Granite-Code-20B-Instruct | 892.103915 | 902.966362 | 912.70388 | 9.737518 | 10.862447 | 902.97 |
| StarCoder2-15B-Instruct-v0.1 | 888.952289 | 899.141441 | 909.78914 | 10.647699 | 10.189152 | 899.14 |
| Qwen1.5-32B-Chat | 877.275017 | 888.015863 | 898.354125 | 10.338261 | 10.740846 | 888.02 |
| Magicoder-S-DS-6.7B | 863.228913 | 873.787316 | 884.313399 | 10.526083 | 10.558403 | 873.79 |
| CodeLlama-34B-Instruct | 862.210024 | 873.78551 | 885.05695 | 11.271439 | 11.575486 | 873.79 |
| Yi-1.5-34B-Chat | 859.717103 | 870.607635 | 882.059694 | 11.452059 | 10.890531 | 870.61 |
| CodeGemma-7B-Instruct | 850.443597 | 860.699384 | 870.506768 | 9.807384 | 10.255787 | 860.7 |
| Mistral-Large-2402 | 824.436058 | 835.89154 | 846.287191 | 10.395651 | 11.455482 | 835.89 |
| Llama-3-8B-Instruct | 819.781838 | 831.013946 | 841.907523 | 10.893576 | 11.232108 | 831.01 |
| CodeLlama-13B-Instruct | 805.178203 | 818.307272 | 829.907201 | 11.59993 | 13.129068 | 818.31 |
| InternLM2.5-7B-Chat | 803.944552 | 815.374365 | 827.27765 | 11.903285 | 11.429814 | 815.37 |
| Mistral-7B-Instruct-v0.3 | 785.686017 | 797.560097 | 810.503916 | 12.943819 | 11.87408 | 797.56 |
| Yi-1.5-6B-Chat | 774.959698 | 787.98304 | 800.507677 | 12.524637 | 13.023342 | 787.98 |
| OpenCodeInterpreter-DS-1.3B | 737.237349 | 750.876801 | 764.624384 | 13.747583 | 13.639452 | 750.88 |
| CodeLlama-7B-Instruct | 686.317635 | 701.149363 | 713.617964 | 12.468602 | 14.831728 | 701.15 |
| DeepSeek-Coder-1.3B-Instruct | 625.478362 | 642.188436 | 658.514603 | 16.326168 | 16.710073 | 642.19 |
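A loading sketch with the `datasets` library. The Hub repository id and the split name are assumptions, since the card does not state them:

```python
# Loading sketch: fetch the table and sanity-check the derived columns.
# The repo id "bigcode/bigcodebench-hard-test-elo" and the "train" split
# are assumed; adjust them to the dataset's actual location.
from datasets import load_dataset

ds = load_dataset("bigcode/bigcodebench-hard-test-elo", split="train")

for row in ds:
    # error bars are the distances from rating to the interval bounds,
    # and rating_rounded is rating to two decimal places
    assert abs(row["error_y"] - (row["upper"] - row["rating"])) < 1e-4
    assert abs(row["error_y_minus"] - (row["rating"] - row["lower"])) < 1e-4
    assert abs(row["rating_rounded"] - round(row["rating"], 2)) < 1e-4

print(f"{len(ds)} models; top row: {ds[0]['model']}")
```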

