---
pretty_name: Evaluation run of Dampish/StellarX-4B-V0
dataset_summary: "Dataset automatically created during the evaluation run of model\
  \ [Dampish/StellarX-4B-V0](https://huggingface.co/Dampish/StellarX-4B-V0) on the\
  \ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
  \nThe dataset is composed of 61 configurations, each one corresponding to one of\
  \ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
  \ be found as a specific split in each configuration, the split being named using\
  \ the timestamp of the run. The \"latest\" split always points to the most recent\
  \ results.\n\nAn additional configuration \"results\" stores all the aggregated\
  \ results of the run (and is used to compute and display the aggregated metrics\
  \ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
  \nTo load the details from a run, you can for instance do the following:\n```python\n\
  from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Dampish__StellarX-4B-V0\",\n\
  \t\"harness_truthfulqa_mc_0\",\n\tsplit=\"latest\")\n```\n\n## Latest results\n\
  \nThese are the [latest results from run 2023-10-03T17:57:03.227360](https://huggingface.co/datasets/open-llm-leaderboard/details_Dampish__StellarX-4B-V0/blob/main/results_2023-10-03T17-57-03.227360.json)\
  \ (note that there might be results for other tasks in the repo if successive evals\
  \ didn't cover the same tasks; you can find each one in the results and the \"latest\"\
  \ split for each eval):\n\n```python\n{\n\
  \    \"all\": {\n\
  \        \"acc\": 0.27266383036075503,\n\
  \        \"acc_stderr\": 0.03224051042838464,\n\
  \        \"acc_norm\": 0.27616554014040245,\n\
  \        \"acc_norm_stderr\": 0.03224574795269476,\n\
  \        \"mc1\": 0.20685434516523868,\n\
  \        \"mc1_stderr\": 0.014179591496728343,\n\
  \        \"mc2\": 0.34296822571733665,\n\
  \        \"mc2_stderr\": 0.013628027163865984\n\
  \    },\n\
  \    \"harness|arc:challenge|25\": {\n\
  \        \"acc\": 0.32337883959044367,\n\
  \        \"acc_stderr\": 0.013669421630012125,\n\
  \        \"acc_norm\": 0.36945392491467577,\n\
  \        \"acc_norm_stderr\": 0.0141045783664919\n\
  \    },\n\
  \    \"harness|hellaswag|10\": {\n\
  \        \"acc\": 0.4584744074885481,\n\
  \        \"acc_stderr\": 0.004972543127767877,\n\
  \        \"acc_norm\": 0.6190001991635132,\n\
  \        \"acc_norm_stderr\": 0.004846400325585233\n\
  \    },\n\
  \    \"harness|hendrycksTest-abstract_algebra|5\": {\n\
  \        \"acc\": 0.34,\n\
  \        \"acc_stderr\": 0.047609522856952365,\n\
  \        \"acc_norm\": 0.34,\n\
  \        \"acc_norm_stderr\": 0.047609522856952365\n\
  \    },\n\
  \    \"harness|hendrycksTest-anatomy|5\": {\n\
  \        \"acc\": 0.2518518518518518,\n\
  \        \"acc_stderr\": 0.03749850709174022,\n\
  \        \"acc_norm\": 0.2518518518518518,\n\
  \        \"acc_norm_stderr\": 0.03749850709174022\n\
  \    },\n\
  \    \"harness|hendrycksTest-astronomy|5\": {\n\
  \        \"acc\": 0.34210526315789475,\n\
  \        \"acc_stderr\": 0.03860731599316092,\n\
  \        \"acc_norm\": 0.34210526315789475,\n\
  \        \"acc_norm_stderr\": 0.03860731599316092\n\
  \    },\n\
  \    \"harness|hendrycksTest-business_ethics|5\": {\n\
  \        \"acc\": 0.23,\n\
  \        \"acc_stderr\": 0.04229525846816506,\n\
  \        \"acc_norm\": 0.23,\n\
  \        \"acc_norm_stderr\": 0.04229525846816506\n\
  \    },\n\
  \    \"harness|hendrycksTest-clinical_knowledge|5\": {\n\
  \        \"acc\": 0.2830188679245283,\n\
  \        \"acc_stderr\": 0.0277242364927009,\n\
  \        \"acc_norm\": 0.2830188679245283,\n\
  \        \"acc_norm_stderr\": 0.0277242364927009\n\
  \    },\n\
  \    \"harness|hendrycksTest-college_biology|5\": {\n\
  \        \"acc\": 0.24305555555555555,\n\
  \        \"acc_stderr\": 0.03586879280080341,\n\
  \        \"acc_norm\": 0.24305555555555555,\n\
  \        \"acc_norm_stderr\": 0.03586879280080341\n\
  \    },\n\
  \    \"harness|hendrycksTest-college_chemistry|5\": {\n\
  \        \"acc\": 0.26,\n\
  \        \"acc_stderr\": 0.0440844002276808,\n\
  \        \"acc_norm\": 0.26,\n\
  \        \"acc_norm_stderr\": 0.0440844002276808\n\
  \    },\n\
  \    \"harness|hendrycksTest-college_computer_science|5\": {\n\
  \        \"acc\": 0.33,\n\
  \        \"acc_stderr\": 0.04725815626252606,\n\
  \        \"acc_norm\": 0.33,\n\
  \        \"acc_norm_stderr\": 0.04725815626252606\n\
  \    },\n\
  \    \"harness|hendrycksTest-college_mathematics|5\": {\n\
  \        \"acc\": 0.36,\n\
  \        \"acc_stderr\": 0.04824181513244218,\n\
  \        \"acc_norm\": 0.36,\n\
  \        \"acc_norm_stderr\": 0.04824181513244218\n\
  \    },\n\
  \    \"harness|hendrycksTest-college_medicine|5\": {\n\
  \        \"acc\": 0.26011560693641617,\n\
  \        \"acc_stderr\": 0.03345036916788991,\n\
  \        \"acc_norm\": 0.26011560693641617,\n\
  \        \"acc_norm_stderr\": 0.03345036916788991\n\
  \    },\n\
  \    \"harness|hendrycksTest-college_physics|5\": {\n\
  \        \"acc\": 0.20588235294117646,\n\
  \        \"acc_stderr\": 0.04023382273617747,\n\
  \        \"acc_norm\": 0.20588235294117646,\n\
  \        \"acc_norm_stderr\": 0.04023382273617747\n\
  \    },\n\
  \    \"harness|hendrycksTest-computer_security|5\": {\n\
  \        \"acc\": 0.28,\n\
  \        \"acc_stderr\": 0.045126085985421276,\n\
  \        \"acc_norm\": 0.28,\n\
  \        \"acc_norm_stderr\": 0.045126085985421276\n\
  \    },\n\
  \    \"harness|hendrycksTest-conceptual_physics|5\": {\n\
  \        \"acc\": 0.2851063829787234,\n\
  \        \"acc_stderr\": 0.02951319662553935,\n\
  \        \"acc_norm\": 0.2851063829787234,\n\
  \        \"acc_norm_stderr\": 0.02951319662553935\n\
  \    },\n\
  \    \"harness|hendrycksTest-econometrics|5\": {\n\
  \        \"acc\": 0.2719298245614035,\n\
  \        \"acc_stderr\": 0.04185774424022056,\n\
  \        \"acc_norm\": 0.2719298245614035,\n\
  \        \"acc_norm_stderr\": 0.04185774424022056\n\
  \    },\n\
  \    \"harness|hendrycksTest-electrical_engineering|5\": {\n\
  \        \"acc\": 0.296551724137931,\n\
  \        \"acc_stderr\": 0.03806142687309993,\n\
  \        \"acc_norm\": 0.296551724137931,\n\
  \        \"acc_norm_stderr\": 0.03806142687309993\n\
  \    },\n\
  \    \"harness|hendrycksTest-elementary_mathematics|5\": {\n\
  \        \"acc\": 0.26455026455026454,\n\
  \        \"acc_stderr\": 0.022717467897708617,\n\
  \        \"acc_norm\": 0.26455026455026454,\n\
  \        \"acc_norm_stderr\": 0.022717467897708617\n\
  \    },\n\
  \    \"harness|hendrycksTest-formal_logic|5\": {\n\
  \        \"acc\": 0.15079365079365079,\n\
  \        \"acc_stderr\": 0.03200686497287394,\n\
  \        \"acc_norm\": 0.15079365079365079,\n\
  \        \"acc_norm_stderr\": 0.03200686497287394\n\
  \    },\n\
  \    \"harness|hendrycksTest-global_facts|5\": {\n\
  \        \"acc\": 0.33,\n\
  \        \"acc_stderr\": 0.047258156262526045,\n\
  \        \"acc_norm\": 0.33,\n\
  \        \"acc_norm_stderr\": 0.047258156262526045\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_biology|5\": {\n\
  \        \"acc\": 0.22903225806451613,\n\
  \        \"acc_stderr\": 0.02390491431178265,\n\
  \        \"acc_norm\": 0.22903225806451613,\n\
  \        \"acc_norm_stderr\": 0.02390491431178265\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_chemistry|5\": {\n\
  \        \"acc\": 0.2512315270935961,\n\
  \        \"acc_stderr\": 0.030516530732694433,\n\
  \        \"acc_norm\": 0.2512315270935961,\n\
  \        \"acc_norm_stderr\": 0.030516530732694433\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_computer_science|5\": {\n\
  \        \"acc\": 0.24,\n\
  \        \"acc_stderr\": 0.04292346959909282,\n\
  \        \"acc_norm\": 0.24,\n\
  \        \"acc_norm_stderr\": 0.04292346959909282\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_european_history|5\": {\n\
  \        \"acc\": 0.23030303030303031,\n\
  \        \"acc_stderr\": 0.03287666758603489,\n\
  \        \"acc_norm\": 0.23030303030303031,\n\
  \        \"acc_norm_stderr\": 0.03287666758603489\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_geography|5\": {\n\
  \        \"acc\": 0.35353535353535354,\n\
  \        \"acc_stderr\": 0.03406086723547153,\n\
  \        \"acc_norm\": 0.35353535353535354,\n\
  \        \"acc_norm_stderr\": 0.03406086723547153\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
  \        \"acc\": 0.26424870466321243,\n\
  \        \"acc_stderr\": 0.031821550509166484,\n\
  \        \"acc_norm\": 0.26424870466321243,\n\
  \        \"acc_norm_stderr\": 0.031821550509166484\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n\
  \        \"acc\": 0.2512820512820513,\n\
  \        \"acc_stderr\": 0.02199201666237056,\n\
  \        \"acc_norm\": 0.2512820512820513,\n\
  \        \"acc_norm_stderr\": 0.02199201666237056\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_mathematics|5\": {\n\
  \        \"acc\": 0.2222222222222222,\n\
  \        \"acc_stderr\": 0.025348097468097838,\n\
  \        \"acc_norm\": 0.2222222222222222,\n\
  \        \"acc_norm_stderr\": 0.025348097468097838\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_microeconomics|5\": {\n\
  \        \"acc\": 0.19327731092436976,\n\
  \        \"acc_stderr\": 0.0256494702658892,\n\
  \        \"acc_norm\": 0.19327731092436976,\n\
  \        \"acc_norm_stderr\": 0.0256494702658892\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_physics|5\": {\n\
  \        \"acc\": 0.271523178807947,\n\
  \        \"acc_stderr\": 0.03631329803969653,\n\
  \        \"acc_norm\": 0.271523178807947,\n\
  \        \"acc_norm_stderr\": 0.03631329803969653\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_psychology|5\": {\n\
  \        \"acc\": 0.3284403669724771,\n\
  \        \"acc_stderr\": 0.020135902797298395,\n\
  \        \"acc_norm\": 0.3284403669724771,\n\
  \        \"acc_norm_stderr\": 0.020135902797298395\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_statistics|5\": {\n\
  \        \"acc\": 0.3287037037037037,\n\
  \        \"acc_stderr\": 0.032036140846700596,\n\
  \        \"acc_norm\": 0.3287037037037037,\n\
  \        \"acc_norm_stderr\": 0.032036140846700596\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_us_history|5\": {\n\
  \        \"acc\": 0.24019607843137256,\n\
  \        \"acc_stderr\": 0.02998373305591362,\n\
  \        \"acc_norm\": 0.24019607843137256,\n\
  \        \"acc_norm_stderr\": 0.02998373305591362\n\
  \    },\n\
  \    \"harness|hendrycksTest-high_school_world_history|5\": {\n\
  \        \"acc\": 0.270042194092827,\n\
  \        \"acc_stderr\": 0.028900721906293426,\n\
  \        \"acc_norm\": 0.270042194092827,\n\
  \        \"acc_norm_stderr\": 0.028900721906293426\n\
  \    },\n\
  \    \"harness|hendrycksTest-human_aging|5\": {\n\
  \        \"acc\": 0.11659192825112108,\n\
  \        \"acc_stderr\": 0.02153963981624447,\n\
  \        \"acc_norm\": 0.11659192825112108,\n\
  \        \"acc_norm_stderr\": 0.02153963981624447\n\
  \    },\n\
  \    \"harness|hendrycksTest-human_sexuality|5\": {\n\
  \        \"acc\": 0.24427480916030533,\n\
  \        \"acc_stderr\": 0.03768335959728744,\n\
  \        \"acc_norm\": 0.24427480916030533,\n\
  \        \"acc_norm_stderr\": 0.03768335959728744\n\
  \    },\n\
  \    \"harness|hendrycksTest-international_law|5\": {\n\
  \        \"acc\": 0.4049586776859504,\n\
  \        \"acc_stderr\": 0.044811377559424694,\n\
  \        \"acc_norm\": 0.4049586776859504,\n\
  \        \"acc_norm_stderr\": 0.044811377559424694\n\
  \    },\n\
  \    \"harness|hendrycksTest-jurisprudence|5\": {\n\
  \        \"acc\": 0.25,\n\
  \        \"acc_stderr\": 0.04186091791394607,\n\
  \        \"acc_norm\": 0.25,\n\
  \        \"acc_norm_stderr\": 0.04186091791394607\n\
  \    },\n\
  \    \"harness|hendrycksTest-logical_fallacies|5\": {\n\
  \        \"acc\": 0.24539877300613497,\n\
  \        \"acc_stderr\": 0.03380939813943354,\n\
  \        \"acc_norm\": 0.24539877300613497,\n\
  \        \"acc_norm_stderr\": 0.03380939813943354\n\
  \    },\n\
  \    \"harness|hendrycksTest-machine_learning|5\": {\n\
  \        \"acc\": 0.2767857142857143,\n\
  \        \"acc_stderr\": 0.04246624336697625,\n\
  \        \"acc_norm\": 0.2767857142857143,\n\
  \        \"acc_norm_stderr\": 0.04246624336697625\n\
  \    },\n\
  \    \"harness|hendrycksTest-management|5\": {\n\
  \        \"acc\": 0.30097087378640774,\n\
  \        \"acc_stderr\": 0.045416094465039476,\n\
  \        \"acc_norm\": 0.30097087378640774,\n\
  \        \"acc_norm_stderr\": 0.045416094465039476\n\
  \    },\n\
  \    \"harness|hendrycksTest-marketing|5\": {\n\
  \        \"acc\": 0.29914529914529914,\n\
  \        \"acc_stderr\": 0.029996951858349483,\n\
  \        \"acc_norm\": 0.29914529914529914,\n\
  \        \"acc_norm_stderr\": 0.029996951858349483\n\
  \    },\n\
  \    \"harness|hendrycksTest-medical_genetics|5\": {\n\
  \        \"acc\": 0.27,\n\
  \        \"acc_stderr\": 0.044619604333847394,\n\
  \        \"acc_norm\": 0.27,\n\
  \        \"acc_norm_stderr\": 0.044619604333847394\n\
  \    },\n\
  \    \"harness|hendrycksTest-miscellaneous|5\": {\n\
  \        \"acc\": 0.28735632183908044,\n\
  \        \"acc_stderr\": 0.0161824107306827,\n\
  \        \"acc_norm\": 0.28735632183908044,\n\
  \        \"acc_norm_stderr\": 0.0161824107306827\n\
  \    },\n\
  \    \"harness|hendrycksTest-moral_disputes|5\": {\n\
  \        \"acc\": 0.24855491329479767,\n\
  \        \"acc_stderr\": 0.023267528432100174,\n\
  \        \"acc_norm\": 0.24855491329479767,\n\
  \        \"acc_norm_stderr\": 0.023267528432100174\n\
  \    },\n\
  \    \"harness|hendrycksTest-moral_scenarios|5\": {\n\
  \        \"acc\": 0.27262569832402234,\n\
  \        \"acc_stderr\": 0.014893391735249588,\n\
  \        \"acc_norm\": 0.27262569832402234,\n\
  \        \"acc_norm_stderr\": 0.014893391735249588\n\
  \    },\n\
  \    \"harness|hendrycksTest-nutrition|5\": {\n\
  \        \"acc\": 0.2549019607843137,\n\
  \        \"acc_stderr\": 0.02495418432487991,\n\
  \        \"acc_norm\": 0.2549019607843137,\n\
  \        \"acc_norm_stderr\": 0.02495418432487991\n\
  \    },\n\
  \    \"harness|hendrycksTest-philosophy|5\": {\n\
  \        \"acc\": 0.28938906752411575,\n\
  \        \"acc_stderr\": 0.02575586592263294,\n\
  \        \"acc_norm\": 0.28938906752411575,\n\
  \        \"acc_norm_stderr\": 0.02575586592263294\n\
  \    },\n\
  \    \"harness|hendrycksTest-prehistory|5\": {\n\
  \        \"acc\": 0.2654320987654321,\n\
  \        \"acc_stderr\": 0.024569223600460852,\n\
  \        \"acc_norm\": 0.2654320987654321,\n\
  \        \"acc_norm_stderr\": 0.024569223600460852\n\
  \    },\n\
  \    \"harness|hendrycksTest-professional_accounting|5\": {\n\
  \        \"acc\": 0.2695035460992908,\n\
  \        \"acc_stderr\": 0.026469036818590627,\n\
  \        \"acc_norm\": 0.2695035460992908,\n\
  \        \"acc_norm_stderr\": 0.026469036818590627\n\
  \    },\n\
  \    \"harness|hendrycksTest-professional_law|5\": {\n\
  \        \"acc\": 0.2633637548891786,\n\
  \        \"acc_stderr\": 0.011249506403605291,\n\
  \        \"acc_norm\": 0.2633637548891786,\n\
  \        \"acc_norm_stderr\": 0.011249506403605291\n\
  \    },\n\
  \    \"harness|hendrycksTest-professional_medicine|5\": {\n\
  \        \"acc\": 0.2426470588235294,\n\
  \        \"acc_stderr\": 0.02604066247420126,\n\
  \        \"acc_norm\": 0.2426470588235294,\n\
  \        \"acc_norm_stderr\": 0.02604066247420126\n\
  \    },\n\
  \    \"harness|hendrycksTest-professional_psychology|5\": {\n\
  \        \"acc\": 0.24019607843137256,\n\
  \        \"acc_stderr\": 0.017282760695167414,\n\
  \        \"acc_norm\": 0.24019607843137256,\n\
  \        \"acc_norm_stderr\": 0.017282760695167414\n\
  \    },\n\
  \    \"harness|hendrycksTest-public_relations|5\": {\n\
  \        \"acc\": 0.33636363636363636,\n\
  \        \"acc_stderr\": 0.04525393596302506,\n\
  \        \"acc_norm\": 0.33636363636363636,\n\
  \        \"acc_norm_stderr\": 0.04525393596302506\n\
  \    },\n\
  \    \"harness|hendrycksTest-security_studies|5\": {\n\
  \        \"acc\": 0.23673469387755103,\n\
  \        \"acc_stderr\": 0.02721283588407314,\n\
  \        \"acc_norm\": 0.23673469387755103,\n\
  \        \"acc_norm_stderr\": 0.02721283588407314\n\
  \    },\n\
  \    \"harness|hendrycksTest-sociology|5\": {\n\
  \        \"acc\": 0.24378109452736318,\n\
  \        \"acc_stderr\": 0.03036049015401468,\n\
  \        \"acc_norm\": 0.24378109452736318,\n\
  \        \"acc_norm_stderr\": 0.03036049015401468\n\
  \    },\n\
  \    \"harness|hendrycksTest-us_foreign_policy|5\": {\n\
  \        \"acc\": 0.26,\n\
  \        \"acc_stderr\": 0.04408440022768078,\n\
  \        \"acc_norm\": 0.26,\n\
  \        \"acc_norm_stderr\": 0.04408440022768078\n\
  \    },\n\
  \    \"harness|hendrycksTest-virology|5\": {\n\
  \        \"acc\": 0.23493975903614459,\n\
  \        \"acc_stderr\": 0.03300533186128922,\n\
  \        \"acc_norm\": 0.23493975903614459,\n\
  \        \"acc_norm_stderr\": 0.03300533186128922\n\
  \    },\n\
  \    \"harness|hendrycksTest-world_religions|5\": {\n\
  \        \"acc\": 0.29239766081871343,\n\
  \        \"acc_stderr\": 0.034886477134579215,\n\
  \        \"acc_norm\": 0.29239766081871343,\n\
  \        \"acc_norm_stderr\": 0.034886477134579215\n\
  \    },\n\
  \    \"harness|truthfulqa:mc|0\": {\n\
  \        \"mc1\": 0.20685434516523868,\n\
  \        \"mc1_stderr\": 0.014179591496728343,\n\
  \        \"mc2\": 0.34296822571733665,\n\
  \        \"mc2_stderr\": 0.013628027163865984\n\
  \    }\n\
  }\n```"
repo_url: https://huggingface.co/Dampish/StellarX-4B-V0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|arc:challenge|25_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|arc:challenge|25_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hellaswag_10
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hellaswag|10_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hellaswag|10_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-57-03.227360.parquet'
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-abstract_algebra|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_anatomy_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-anatomy|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_astronomy_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-astronomy|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-business_ethics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-clinical_knowledge|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_biology_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_biology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_medicine|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_college_physics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-college_physics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_computer_security_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-computer_security|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-conceptual_physics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_econometrics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-econometrics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-electrical_engineering|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-elementary_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-formal_logic|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_global_facts_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-global_facts|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_biology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_chemistry|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_computer_science|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_european_history|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_geography|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_mathematics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_microeconomics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_physics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_psychology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_statistics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_us_history|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-high_school_world_history|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_human_aging_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_aging|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-human_sexuality|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_international_law_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-international_law|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-jurisprudence|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-logical_fallacies|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-machine_learning|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_management_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-management|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_marketing_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-marketing|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-medical_genetics|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-miscellaneous|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_disputes|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-moral_scenarios|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_nutrition_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-nutrition|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_philosophy_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-philosophy|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_prehistory_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-prehistory|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_accounting|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_professional_law_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_law|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_medicine|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-professional_psychology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_public_relations_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-public_relations|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_security_studies_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-security_studies|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_sociology_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-sociology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-us_foreign_policy|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_virology_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-virology|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_hendrycksTest_world_religions_5
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|hendrycksTest-world_religions|5_2023-10-03T17-57-03.227360.parquet'
- config_name: harness_truthfulqa_mc_0
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-03T17-57-03.227360.parquet'
  - split: latest
    path:
    - '**/details_harness|truthfulqa:mc|0_2023-10-03T17-57-03.227360.parquet'
- config_name: results
  data_files:
  - split: 2023_10_03T17_57_03.227360
    path:
    - results_2023-10-03T17-57-03.227360.parquet
  - split: latest
    path:
    - results_2023-10-03T17-57-03.227360.parquet
---

# Dataset Card for Evaluation run of Dampish/StellarX-4B-V0

## Dataset Description

- **Homepage:**
- **Repository:** https://huggingface.co/Dampish/StellarX-4B-V0
- **Paper:**
- **Leaderboard:** https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
- **Point of Contact:** clementine@hf.co

### Dataset Summary

Dataset automatically created during the evaluation run of model [Dampish/StellarX-4B-V0](https://huggingface.co/Dampish/StellarX-4B-V0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).

The dataset is composed of 61 configurations, each one corresponding to one of the evaluated tasks.

The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.

An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).

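As a quick sanity check, you can enumerate the available configurations programmatically; this is a minimal sketch, assuming a `datasets` version recent enough to read the `configs` metadata declared in this card's YAML header:

```python
from datasets import get_dataset_config_names

# List every evaluation configuration declared in this card's YAML header.
config_names = get_dataset_config_names("open-llm-leaderboard/details_Dampish__StellarX-4B-V0")
print(len(config_names), config_names[:5])
```
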
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Dampish__StellarX-4B-V0",
                    "harness_truthfulqa_mc_0",
                    split="latest")
```
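
The aggregated metrics live in the "results" configuration declared above; a minimal sketch for loading them (here using the `latest` split, which points at the most recent run):

```python
from datasets import load_dataset

# Aggregated run-level metrics (the "results" config from the YAML header).
results = load_dataset("open-llm-leaderboard/details_Dampish__StellarX-4B-V0",
                       "results",
                       split="latest")
print(results[0])
```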

## Latest results

These are the [latest results from run 2023-10-03T17:57:03.227360](https://huggingface.co/datasets/open-llm-leaderboard/details_Dampish__StellarX-4B-V0/blob/main/results_2023-10-03T17-57-03.227360.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
+
+ ```python
+ {
+     "all": {
+         "acc": 0.27266383036075503,
+         "acc_stderr": 0.03224051042838464,
+         "acc_norm": 0.27616554014040245,
+         "acc_norm_stderr": 0.03224574795269476,
+         "mc1": 0.20685434516523868,
+         "mc1_stderr": 0.014179591496728343,
+         "mc2": 0.34296822571733665,
+         "mc2_stderr": 0.013628027163865984
+     },
+     "harness|arc:challenge|25": {
+         "acc": 0.32337883959044367,
+         "acc_stderr": 0.013669421630012125,
+         "acc_norm": 0.36945392491467577,
+         "acc_norm_stderr": 0.0141045783664919
+     },
+     "harness|hellaswag|10": {
+         "acc": 0.4584744074885481,
+         "acc_stderr": 0.004972543127767877,
+         "acc_norm": 0.6190001991635132,
+         "acc_norm_stderr": 0.004846400325585233
+     },
+     "harness|hendrycksTest-abstract_algebra|5": {
+         "acc": 0.34,
+         "acc_stderr": 0.047609522856952365,
+         "acc_norm": 0.34,
+         "acc_norm_stderr": 0.047609522856952365
+     },
+     "harness|hendrycksTest-anatomy|5": {
+         "acc": 0.2518518518518518,
+         "acc_stderr": 0.03749850709174022,
+         "acc_norm": 0.2518518518518518,
+         "acc_norm_stderr": 0.03749850709174022
+     },
+     "harness|hendrycksTest-astronomy|5": {
+         "acc": 0.34210526315789475,
+         "acc_stderr": 0.03860731599316092,
+         "acc_norm": 0.34210526315789475,
+         "acc_norm_stderr": 0.03860731599316092
+     },
+     "harness|hendrycksTest-business_ethics|5": {
+         "acc": 0.23,
+         "acc_stderr": 0.04229525846816506,
+         "acc_norm": 0.23,
+         "acc_norm_stderr": 0.04229525846816506
+     },
+     "harness|hendrycksTest-clinical_knowledge|5": {
+         "acc": 0.2830188679245283,
+         "acc_stderr": 0.0277242364927009,
+         "acc_norm": 0.2830188679245283,
+         "acc_norm_stderr": 0.0277242364927009
+     },
+     "harness|hendrycksTest-college_biology|5": {
+         "acc": 0.24305555555555555,
+         "acc_stderr": 0.03586879280080341,
+         "acc_norm": 0.24305555555555555,
+         "acc_norm_stderr": 0.03586879280080341
+     },
+     "harness|hendrycksTest-college_chemistry|5": {
+         "acc": 0.26,
+         "acc_stderr": 0.0440844002276808,
+         "acc_norm": 0.26,
+         "acc_norm_stderr": 0.0440844002276808
+     },
+     "harness|hendrycksTest-college_computer_science|5": {
+         "acc": 0.33,
+         "acc_stderr": 0.04725815626252606,
+         "acc_norm": 0.33,
+         "acc_norm_stderr": 0.04725815626252606
+     },
+     "harness|hendrycksTest-college_mathematics|5": {
+         "acc": 0.36,
+         "acc_stderr": 0.04824181513244218,
+         "acc_norm": 0.36,
+         "acc_norm_stderr": 0.04824181513244218
+     },
+     "harness|hendrycksTest-college_medicine|5": {
+         "acc": 0.26011560693641617,
+         "acc_stderr": 0.03345036916788991,
+         "acc_norm": 0.26011560693641617,
+         "acc_norm_stderr": 0.03345036916788991
+     },
+     "harness|hendrycksTest-college_physics|5": {
+         "acc": 0.20588235294117646,
+         "acc_stderr": 0.04023382273617747,
+         "acc_norm": 0.20588235294117646,
+         "acc_norm_stderr": 0.04023382273617747
+     },
+     "harness|hendrycksTest-computer_security|5": {
+         "acc": 0.28,
+         "acc_stderr": 0.045126085985421276,
+         "acc_norm": 0.28,
+         "acc_norm_stderr": 0.045126085985421276
+     },
+     "harness|hendrycksTest-conceptual_physics|5": {
+         "acc": 0.2851063829787234,
+         "acc_stderr": 0.02951319662553935,
+         "acc_norm": 0.2851063829787234,
+         "acc_norm_stderr": 0.02951319662553935
+     },
+     "harness|hendrycksTest-econometrics|5": {
+         "acc": 0.2719298245614035,
+         "acc_stderr": 0.04185774424022056,
+         "acc_norm": 0.2719298245614035,
+         "acc_norm_stderr": 0.04185774424022056
+     },
+     "harness|hendrycksTest-electrical_engineering|5": {
+         "acc": 0.296551724137931,
+         "acc_stderr": 0.03806142687309993,
+         "acc_norm": 0.296551724137931,
+         "acc_norm_stderr": 0.03806142687309993
+     },
+     "harness|hendrycksTest-elementary_mathematics|5": {
+         "acc": 0.26455026455026454,
+         "acc_stderr": 0.022717467897708617,
+         "acc_norm": 0.26455026455026454,
+         "acc_norm_stderr": 0.022717467897708617
+     },
+     "harness|hendrycksTest-formal_logic|5": {
+         "acc": 0.15079365079365079,
+         "acc_stderr": 0.03200686497287394,
+         "acc_norm": 0.15079365079365079,
+         "acc_norm_stderr": 0.03200686497287394
+     },
+     "harness|hendrycksTest-global_facts|5": {
+         "acc": 0.33,
+         "acc_stderr": 0.047258156262526045,
+         "acc_norm": 0.33,
+         "acc_norm_stderr": 0.047258156262526045
+     },
+     "harness|hendrycksTest-high_school_biology|5": {
+         "acc": 0.22903225806451613,
+         "acc_stderr": 0.02390491431178265,
+         "acc_norm": 0.22903225806451613,
+         "acc_norm_stderr": 0.02390491431178265
+     },
+     "harness|hendrycksTest-high_school_chemistry|5": {
+         "acc": 0.2512315270935961,
+         "acc_stderr": 0.030516530732694433,
+         "acc_norm": 0.2512315270935961,
+         "acc_norm_stderr": 0.030516530732694433
+     },
+     "harness|hendrycksTest-high_school_computer_science|5": {
+         "acc": 0.24,
+         "acc_stderr": 0.04292346959909282,
+         "acc_norm": 0.24,
+         "acc_norm_stderr": 0.04292346959909282
+     },
+     "harness|hendrycksTest-high_school_european_history|5": {
+         "acc": 0.23030303030303031,
+         "acc_stderr": 0.03287666758603489,
+         "acc_norm": 0.23030303030303031,
+         "acc_norm_stderr": 0.03287666758603489
+     },
+     "harness|hendrycksTest-high_school_geography|5": {
+         "acc": 0.35353535353535354,
+         "acc_stderr": 0.03406086723547153,
+         "acc_norm": 0.35353535353535354,
+         "acc_norm_stderr": 0.03406086723547153
+     },
+     "harness|hendrycksTest-high_school_government_and_politics|5": {
+         "acc": 0.26424870466321243,
+         "acc_stderr": 0.031821550509166484,
+         "acc_norm": 0.26424870466321243,
+         "acc_norm_stderr": 0.031821550509166484
+     },
+     "harness|hendrycksTest-high_school_macroeconomics|5": {
+         "acc": 0.2512820512820513,
+         "acc_stderr": 0.02199201666237056,
+         "acc_norm": 0.2512820512820513,
+         "acc_norm_stderr": 0.02199201666237056
+     },
+     "harness|hendrycksTest-high_school_mathematics|5": {
+         "acc": 0.2222222222222222,
+         "acc_stderr": 0.025348097468097838,
+         "acc_norm": 0.2222222222222222,
+         "acc_norm_stderr": 0.025348097468097838
+     },
+     "harness|hendrycksTest-high_school_microeconomics|5": {
+         "acc": 0.19327731092436976,
+         "acc_stderr": 0.0256494702658892,
+         "acc_norm": 0.19327731092436976,
+         "acc_norm_stderr": 0.0256494702658892
+     },
+     "harness|hendrycksTest-high_school_physics|5": {
+         "acc": 0.271523178807947,
+         "acc_stderr": 0.03631329803969653,
+         "acc_norm": 0.271523178807947,
+         "acc_norm_stderr": 0.03631329803969653
+     },
+     "harness|hendrycksTest-high_school_psychology|5": {
+         "acc": 0.3284403669724771,
+         "acc_stderr": 0.020135902797298395,
+         "acc_norm": 0.3284403669724771,
+         "acc_norm_stderr": 0.020135902797298395
+     },
+     "harness|hendrycksTest-high_school_statistics|5": {
+         "acc": 0.3287037037037037,
+         "acc_stderr": 0.032036140846700596,
+         "acc_norm": 0.3287037037037037,
+         "acc_norm_stderr": 0.032036140846700596
+     },
+     "harness|hendrycksTest-high_school_us_history|5": {
+         "acc": 0.24019607843137256,
+         "acc_stderr": 0.02998373305591362,
+         "acc_norm": 0.24019607843137256,
+         "acc_norm_stderr": 0.02998373305591362
+     },
+     "harness|hendrycksTest-high_school_world_history|5": {
+         "acc": 0.270042194092827,
+         "acc_stderr": 0.028900721906293426,
+         "acc_norm": 0.270042194092827,
+         "acc_norm_stderr": 0.028900721906293426
+     },
+     "harness|hendrycksTest-human_aging|5": {
+         "acc": 0.11659192825112108,
+         "acc_stderr": 0.02153963981624447,
+         "acc_norm": 0.11659192825112108,
+         "acc_norm_stderr": 0.02153963981624447
+     },
+     "harness|hendrycksTest-human_sexuality|5": {
+         "acc": 0.24427480916030533,
+         "acc_stderr": 0.03768335959728744,
+         "acc_norm": 0.24427480916030533,
+         "acc_norm_stderr": 0.03768335959728744
+     },
+     "harness|hendrycksTest-international_law|5": {
+         "acc": 0.4049586776859504,
+         "acc_stderr": 0.044811377559424694,
+         "acc_norm": 0.4049586776859504,
+         "acc_norm_stderr": 0.044811377559424694
+     },
+     "harness|hendrycksTest-jurisprudence|5": {
+         "acc": 0.25,
+         "acc_stderr": 0.04186091791394607,
+         "acc_norm": 0.25,
+         "acc_norm_stderr": 0.04186091791394607
+     },
+     "harness|hendrycksTest-logical_fallacies|5": {
+         "acc": 0.24539877300613497,
+         "acc_stderr": 0.03380939813943354,
+         "acc_norm": 0.24539877300613497,
+         "acc_norm_stderr": 0.03380939813943354
+     },
+     "harness|hendrycksTest-machine_learning|5": {
+         "acc": 0.2767857142857143,
+         "acc_stderr": 0.04246624336697625,
+         "acc_norm": 0.2767857142857143,
+         "acc_norm_stderr": 0.04246624336697625
+     },
+     "harness|hendrycksTest-management|5": {
+         "acc": 0.30097087378640774,
+         "acc_stderr": 0.045416094465039476,
+         "acc_norm": 0.30097087378640774,
+         "acc_norm_stderr": 0.045416094465039476
+     },
+     "harness|hendrycksTest-marketing|5": {
+         "acc": 0.29914529914529914,
+         "acc_stderr": 0.029996951858349483,
+         "acc_norm": 0.29914529914529914,
+         "acc_norm_stderr": 0.029996951858349483
+     },
+     "harness|hendrycksTest-medical_genetics|5": {
+         "acc": 0.27,
+         "acc_stderr": 0.044619604333847394,
+         "acc_norm": 0.27,
+         "acc_norm_stderr": 0.044619604333847394
+     },
+     "harness|hendrycksTest-miscellaneous|5": {
+         "acc": 0.28735632183908044,
+         "acc_stderr": 0.0161824107306827,
+         "acc_norm": 0.28735632183908044,
+         "acc_norm_stderr": 0.0161824107306827
+     },
+     "harness|hendrycksTest-moral_disputes|5": {
+         "acc": 0.24855491329479767,
+         "acc_stderr": 0.023267528432100174,
+         "acc_norm": 0.24855491329479767,
+         "acc_norm_stderr": 0.023267528432100174
+     },
+     "harness|hendrycksTest-moral_scenarios|5": {
+         "acc": 0.27262569832402234,
+         "acc_stderr": 0.014893391735249588,
+         "acc_norm": 0.27262569832402234,
+         "acc_norm_stderr": 0.014893391735249588
+     },
+     "harness|hendrycksTest-nutrition|5": {
+         "acc": 0.2549019607843137,
+         "acc_stderr": 0.02495418432487991,
+         "acc_norm": 0.2549019607843137,
+         "acc_norm_stderr": 0.02495418432487991
+     },
+     "harness|hendrycksTest-philosophy|5": {
+         "acc": 0.28938906752411575,
+         "acc_stderr": 0.02575586592263294,
+         "acc_norm": 0.28938906752411575,
+         "acc_norm_stderr": 0.02575586592263294
+     },
+     "harness|hendrycksTest-prehistory|5": {
+         "acc": 0.2654320987654321,
+         "acc_stderr": 0.024569223600460852,
+         "acc_norm": 0.2654320987654321,
+         "acc_norm_stderr": 0.024569223600460852
+     },
+     "harness|hendrycksTest-professional_accounting|5": {
+         "acc": 0.2695035460992908,
+         "acc_stderr": 0.026469036818590627,
+         "acc_norm": 0.2695035460992908,
+         "acc_norm_stderr": 0.026469036818590627
+     },
+     "harness|hendrycksTest-professional_law|5": {
+         "acc": 0.2633637548891786,
+         "acc_stderr": 0.011249506403605291,
+         "acc_norm": 0.2633637548891786,
+         "acc_norm_stderr": 0.011249506403605291
+     },
+     "harness|hendrycksTest-professional_medicine|5": {
+         "acc": 0.2426470588235294,
+         "acc_stderr": 0.02604066247420126,
+         "acc_norm": 0.2426470588235294,
+         "acc_norm_stderr": 0.02604066247420126
+     },
+     "harness|hendrycksTest-professional_psychology|5": {
+         "acc": 0.24019607843137256,
+         "acc_stderr": 0.017282760695167414,
+         "acc_norm": 0.24019607843137256,
+         "acc_norm_stderr": 0.017282760695167414
+     },
+     "harness|hendrycksTest-public_relations|5": {
+         "acc": 0.33636363636363636,
+         "acc_stderr": 0.04525393596302506,
+         "acc_norm": 0.33636363636363636,
+         "acc_norm_stderr": 0.04525393596302506
+     },
+     "harness|hendrycksTest-security_studies|5": {
+         "acc": 0.23673469387755103,
+         "acc_stderr": 0.02721283588407314,
+         "acc_norm": 0.23673469387755103,
+         "acc_norm_stderr": 0.02721283588407314
+     },
+     "harness|hendrycksTest-sociology|5": {
+         "acc": 0.24378109452736318,
+         "acc_stderr": 0.03036049015401468,
+         "acc_norm": 0.24378109452736318,
+         "acc_norm_stderr": 0.03036049015401468
+     },
+     "harness|hendrycksTest-us_foreign_policy|5": {
+         "acc": 0.26,
+         "acc_stderr": 0.04408440022768078,
+         "acc_norm": 0.26,
+         "acc_norm_stderr": 0.04408440022768078
+     },
+     "harness|hendrycksTest-virology|5": {
+         "acc": 0.23493975903614459,
+         "acc_stderr": 0.03300533186128922,
+         "acc_norm": 0.23493975903614459,
+         "acc_norm_stderr": 0.03300533186128922
+     },
+     "harness|hendrycksTest-world_religions|5": {
+         "acc": 0.29239766081871343,
+         "acc_stderr": 0.034886477134579215,
+         "acc_norm": 0.29239766081871343,
+         "acc_norm_stderr": 0.034886477134579215
+     },
+     "harness|truthfulqa:mc|0": {
+         "mc1": 0.20685434516523868,
+         "mc1_stderr": 0.014179591496728343,
+         "mc2": 0.34296822571733665,
+         "mc2_stderr": 0.013628027163865984
+     }
+ }
+ ```
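+
+ A minimal sketch of how this exact results file could be fetched and parsed. The repo id and file name come from the link above; `hf_hub_download` is the standard `huggingface_hub` helper, and the variable names are illustrative, not part of the generated card:
+
+ ```python
+ import json
+
+ from huggingface_hub import hf_hub_download
+
+ # Download the aggregated results JSON for this run from the dataset repo.
+ path = hf_hub_download(
+     repo_id="open-llm-leaderboard/details_Dampish__StellarX-4B-V0",
+     filename="results_2023-10-03T17-57-03.227360.json",
+     repo_type="dataset",
+ )
+
+ with open(path) as f:
+     run = json.load(f)
+
+ # Inspect the top-level keys first; the per-task metrics shown above live
+ # under one of them (assumed layout, so check before indexing further).
+ print(list(run.keys()))
+ ```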
+
+ ### Supported Tasks and Leaderboards
+
+ [More Information Needed]
+
+ ### Languages
+
+ [More Information Needed]
+
+ ## Dataset Structure
+
+ ### Data Instances
+
+ [More Information Needed]
+
+ ### Data Fields
+
+ [More Information Needed]
+
+ ### Data Splits
+
+ [More Information Needed]
+
+ ## Dataset Creation
+
+ ### Curation Rationale
+
+ [More Information Needed]
+
+ ### Source Data
+
+ #### Initial Data Collection and Normalization
+
+ [More Information Needed]
+
+ #### Who are the source language producers?
+
+ [More Information Needed]
+
+ ### Annotations
+
+ #### Annotation process
+
+ [More Information Needed]
+
+ #### Who are the annotators?
+
+ [More Information Needed]
+
+ ### Personal and Sensitive Information
+
+ [More Information Needed]
+
+ ## Considerations for Using the Data
+
+ ### Social Impact of Dataset
+
+ [More Information Needed]
+
+ ### Discussion of Biases
+
+ [More Information Needed]
+
+ ### Other Known Limitations
+
+ [More Information Needed]
+
+ ## Additional Information
+
+ ### Dataset Curators
+
+ [More Information Needed]
+
+ ### Licensing Information
+
+ [More Information Needed]
+
+ ### Citation Information
+
+ [More Information Needed]
+
+ ### Contributions
+
+ [More Information Needed]