---
pretty_name: Evaluation run of Aratako/c4ai-command-r-v01-japanese-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Aratako/c4ai-command-r-v01-japanese-instruct](https://huggingface.co/Aratako/c4ai-command-r-v01-japanese-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Aratako__c4ai-command-r-v01-japanese-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-06T15:26:41.930065](https://huggingface.co/datasets/open-llm-leaderboard/details_Aratako__c4ai-command-r-v01-japanese-instruct/blob/main/results_2024-04-06T15-26-41.930065.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6762099088456663,\n\
\ \"acc_stderr\": 0.031218722032505204,\n \"acc_norm\": 0.6788659428254314,\n\
\ \"acc_norm_stderr\": 0.031843003332528694,\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.5100708052144302,\n\
\ \"mc2_stderr\": 0.014817034256109527\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6220136518771331,\n \"acc_stderr\": 0.014169664520303098,\n\
\ \"acc_norm\": 0.658703071672355,\n \"acc_norm_stderr\": 0.013855831287497724\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6586337382991436,\n\
\ \"acc_stderr\": 0.004731989816563666,\n \"acc_norm\": 0.8562039434375622,\n\
\ \"acc_norm_stderr\": 0.003501657107386699\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932262,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932262\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146268,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146268\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.47619047619047616,\n \"acc_stderr\": 0.025722097064388518,\n \"\
acc_norm\": 0.47619047619047616,\n \"acc_norm_stderr\": 0.025722097064388518\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7741935483870968,\n \"acc_stderr\": 0.023785577884181012,\n \"\
acc_norm\": 0.7741935483870968,\n \"acc_norm_stderr\": 0.023785577884181012\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n \"\
acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.030874145136562076,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.030874145136562076\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8383838383838383,\n \"acc_stderr\": 0.026225919863629293,\n \"\
acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.026225919863629293\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.02407869658063547,\n \
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.02407869658063547\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.35555555555555557,\n \"acc_stderr\": 0.02918571494985739,\n \
\ \"acc_norm\": 0.35555555555555557,\n \"acc_norm_stderr\": 0.02918571494985739\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7310924369747899,\n \"acc_stderr\": 0.028801392193631273,\n\
\ \"acc_norm\": 0.7310924369747899,\n \"acc_norm_stderr\": 0.028801392193631273\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553332,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553332\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8607594936708861,\n \"acc_stderr\": 0.022535526352692705,\n \
\ \"acc_norm\": 0.8607594936708861,\n \"acc_norm_stderr\": 0.022535526352692705\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.02799153425851952,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.02799153425851952\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8760330578512396,\n \"acc_stderr\": 0.03008309871603521,\n \"\
acc_norm\": 0.8760330578512396,\n \"acc_norm_stderr\": 0.03008309871603521\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8425925925925926,\n\
\ \"acc_stderr\": 0.03520703990517963,\n \"acc_norm\": 0.8425925925925926,\n\
\ \"acc_norm_stderr\": 0.03520703990517963\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8159509202453987,\n \"acc_stderr\": 0.030446777687971726,\n\
\ \"acc_norm\": 0.8159509202453987,\n \"acc_norm_stderr\": 0.030446777687971726\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.9017094017094017,\n\
\ \"acc_stderr\": 0.019503444900757567,\n \"acc_norm\": 0.9017094017094017,\n\
\ \"acc_norm_stderr\": 0.019503444900757567\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036845\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8646232439335888,\n\
\ \"acc_stderr\": 0.012234384586856495,\n \"acc_norm\": 0.8646232439335888,\n\
\ \"acc_norm_stderr\": 0.012234384586856495\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.02454761779480383,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.02454761779480383\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5441340782122905,\n\
\ \"acc_stderr\": 0.016657229424586306,\n \"acc_norm\": 0.5441340782122905,\n\
\ \"acc_norm_stderr\": 0.016657229424586306\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182651,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182651\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.024723861504771696,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.024723861504771696\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7561728395061729,\n \"acc_stderr\": 0.023891879541959597,\n\
\ \"acc_norm\": 0.7561728395061729,\n \"acc_norm_stderr\": 0.023891879541959597\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5425531914893617,\n \"acc_stderr\": 0.02971928127223685,\n \
\ \"acc_norm\": 0.5425531914893617,\n \"acc_norm_stderr\": 0.02971928127223685\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5469361147327249,\n\
\ \"acc_stderr\": 0.012713845972358986,\n \"acc_norm\": 0.5469361147327249,\n\
\ \"acc_norm_stderr\": 0.012713845972358986\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.7222222222222222,\n \"acc_stderr\": 0.018120224251484577,\n \
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.018120224251484577\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.043502714429232425,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.043502714429232425\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7755102040816326,\n \"acc_stderr\": 0.02671143055553842,\n\
\ \"acc_norm\": 0.7755102040816326,\n \"acc_norm_stderr\": 0.02671143055553842\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8706467661691543,\n\
\ \"acc_stderr\": 0.02372983088101853,\n \"acc_norm\": 0.8706467661691543,\n\
\ \"acc_norm_stderr\": 0.02372983088101853\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896309,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896309\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160872,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160872\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3463892288861689,\n\
\ \"mc1_stderr\": 0.01665699710912514,\n \"mc2\": 0.5100708052144302,\n\
\ \"mc2_stderr\": 0.014817034256109527\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.829518547750592,\n \"acc_stderr\": 0.010569021122825916\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6004548900682335,\n \
\ \"acc_stderr\": 0.013491660298815994\n }\n}\n```"
repo_url: https://huggingface.co/Aratako/c4ai-command-r-v01-japanese-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|arc:challenge|25_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|gsm8k|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hellaswag|10_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T15-26-41.930065.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T15-26-41.930065.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- '**/details_harness|winogrande|5_2024-04-06T15-26-41.930065.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T15-26-41.930065.parquet'
- config_name: results
data_files:
- split: 2024_04_06T15_26_41.930065
path:
- results_2024-04-06T15-26-41.930065.parquet
- split: latest
path:
- results_2024-04-06T15-26-41.930065.parquet
---
# Dataset Card for Evaluation run of Aratako/c4ai-command-r-v01-japanese-instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Aratako/c4ai-command-r-v01-japanese-instruct](https://huggingface.co/Aratako/c4ai-command-r-v01-japanese-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Aratako__c4ai-command-r-v01-japanese-instruct",
"harness_winogrande_5",
split="train")
```
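The same pattern works for any configuration listed in the YAML metadata of this card. As a minimal sketch (assuming the same `datasets` library), the aggregated `results` configuration can be loaded at its `latest` split:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics of the run;
# the "latest" split always points to the files of the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_Aratako__c4ai-command-r-v01-japanese-instruct",
    "results",
    split="latest",
)
print(results[0])  # inspect the aggregated row(s) for the run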
## Latest results
These are the [latest results from run 2024-04-06T15:26:41.930065](https://huggingface.co/datasets/open-llm-leaderboard/details_Aratako__c4ai-command-r-v01-japanese-instruct/blob/main/results_2024-04-06T15-26-41.930065.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6762099088456663,
"acc_stderr": 0.031218722032505204,
"acc_norm": 0.6788659428254314,
"acc_norm_stderr": 0.031843003332528694,
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.5100708052144302,
"mc2_stderr": 0.014817034256109527
},
"harness|arc:challenge|25": {
"acc": 0.6220136518771331,
"acc_stderr": 0.014169664520303098,
"acc_norm": 0.658703071672355,
"acc_norm_stderr": 0.013855831287497724
},
"harness|hellaswag|10": {
"acc": 0.6586337382991436,
"acc_stderr": 0.004731989816563666,
"acc_norm": 0.8562039434375622,
"acc_norm_stderr": 0.003501657107386699
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932262,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932262
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146268,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146268
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.025722097064388518,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.025722097064388518
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.030874145136562076,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.030874145136562076
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.026225919863629293,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.026225919863629293
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.02407869658063547,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.02407869658063547
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.02918571494985739,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.02918571494985739
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7310924369747899,
"acc_stderr": 0.028801392193631273,
"acc_norm": 0.7310924369747899,
"acc_norm_stderr": 0.028801392193631273
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553332,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553332
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8607594936708861,
"acc_stderr": 0.022535526352692705,
"acc_norm": 0.8607594936708861,
"acc_norm_stderr": 0.022535526352692705
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.02799153425851952,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.02799153425851952
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8760330578512396,
"acc_stderr": 0.03008309871603521,
"acc_norm": 0.8760330578512396,
"acc_norm_stderr": 0.03008309871603521
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8425925925925926,
"acc_stderr": 0.03520703990517963,
"acc_norm": 0.8425925925925926,
"acc_norm_stderr": 0.03520703990517963
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8159509202453987,
"acc_stderr": 0.030446777687971726,
"acc_norm": 0.8159509202453987,
"acc_norm_stderr": 0.030446777687971726
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.9017094017094017,
"acc_stderr": 0.019503444900757567,
"acc_norm": 0.9017094017094017,
"acc_norm_stderr": 0.019503444900757567
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8646232439335888,
"acc_stderr": 0.012234384586856495,
"acc_norm": 0.8646232439335888,
"acc_norm_stderr": 0.012234384586856495
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.02454761779480383,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.02454761779480383
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5441340782122905,
"acc_stderr": 0.016657229424586306,
"acc_norm": 0.5441340782122905,
"acc_norm_stderr": 0.016657229424586306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182651,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182651
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.024723861504771696,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.024723861504771696
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7561728395061729,
"acc_stderr": 0.023891879541959597,
"acc_norm": 0.7561728395061729,
"acc_norm_stderr": 0.023891879541959597
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5425531914893617,
"acc_stderr": 0.02971928127223685,
"acc_norm": 0.5425531914893617,
"acc_norm_stderr": 0.02971928127223685
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5469361147327249,
"acc_stderr": 0.012713845972358986,
"acc_norm": 0.5469361147327249,
"acc_norm_stderr": 0.012713845972358986
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.018120224251484577,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.018120224251484577
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.043502714429232425,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.043502714429232425
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7755102040816326,
"acc_stderr": 0.02671143055553842,
"acc_norm": 0.7755102040816326,
"acc_norm_stderr": 0.02671143055553842
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8706467661691543,
"acc_stderr": 0.02372983088101853,
"acc_norm": 0.8706467661691543,
"acc_norm_stderr": 0.02372983088101853
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896309,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896309
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160872,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160872
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3463892288861689,
"mc1_stderr": 0.01665699710912514,
"mc2": 0.5100708052144302,
"mc2_stderr": 0.014817034256109527
},
"harness|winogrande|5": {
"acc": 0.829518547750592,
"acc_stderr": 0.010569021122825916
},
"harness|gsm8k|5": {
"acc": 0.6004548900682335,
"acc_stderr": 0.013491660298815994
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ymoslem/SpokenWords-GA-EN-MTed | ymoslem | "2024-06-24T19:50:55Z" | 0 | 1 | [
"task_categories:automatic-speech-recognition",
"task_categories:text-to-speech",
"task_categories:translation",
"language:ga",
"language:en",
"license:mit",
"size_categories:10K<n<100K",
"format:parquet",
"modality:audio",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"automatic-speech-recognition",
"text-to-speech",
"translation"
] | "2024-04-06T15:31:19Z" | ---
dataset_info:
features:
- name: keyword
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: translation
dtype: string
splits:
- name: train
num_bytes: 350177561.125
num_examples: 10925
download_size: 133270458
dataset_size: 350177561.125
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- automatic-speech-recognition
- text-to-speech
- translation
language:
- ga
- en
size_categories:
- 10K<n<100K
---
# Dataset Card for SpokenWords-GA-EN-MTed
This is the Irish portion of the Spoken Words dataset (available at [MLCommons/ml_spoken_words](https://huggingface.co/datasets/MLCommons/ml_spoken_words)),
with the original “train”, “validation”, and “test” splits merged, and augmented with machine translation.
The Irish sentences are automatically translated into English using the Google Translation API.
The dataset includes approximately 3 hours and 2 minutes of audio (03:02:02), spoken by multiple narrators.
## Dataset Structure
```
Dataset({
features: ['keyword', 'audio', 'translation'],
num_rows: 10925
})
```
# How to load the dataset
```
from datasets import load_dataset
dataset = load_dataset("SpokenWords-GA-EN-MTed",
split="train",
trust_remote_code=True
)
```
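As a rough usage sketch (the field names and 16 kHz sampling rate are taken from the dataset structure above; the `datasets` library decodes the `audio` column into a dict on access):
```
from datasets import load_dataset

dataset = load_dataset("ymoslem/SpokenWords-GA-EN-MTed", split="train")

# Field names follow the dataset structure above
sample = dataset[0]
print(sample["keyword"])       # Irish spoken word
print(sample["translation"])   # English machine translation
audio = sample["audio"]        # decoded dict with "array", "sampling_rate", "path"
print(audio["sampling_rate"])  # 16000, per the dataset_info
```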
## Citations
```
@inproceedings{mazumder2021multilingual,
title={Multilingual Spoken Words Corpus},
author={Mazumder, Mark and Chitlangia, Sharad and Banbury, Colby and Kang, Yiping and Ciro, Juan Manuel and Achorn, Keith and Galvez, Daniel and Sabini, Mark and Mattson, Peter and Kanter, David and others},
booktitle={Thirty-fifth Conference on Neural Information Processing Systems Datasets and Benchmarks Track (Round 2)},
year={2021}
}
```
```
@inproceedings{moslem2024leveraging,
title={Leveraging Synthetic Audio Data for End-to-End Low-Resource Speech Translation},
author={Moslem, Yasmin},
booktitle={Proceedings of the 2024 International Conference on Spoken Language Translation (IWSLT 2024)},
year={2024},
month={April}
}
``` |
hjawad367/ADE_20 | hjawad367 | "2024-04-09T09:06:30Z" | 0 | 0 | [
"license:mit",
"size_categories:1K<n<10K",
"format:parquet",
"modality:image",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T15:40:10Z" | ---
license: mit
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype: image
splits:
- name: train
num_bytes: 210775007.0
num_examples: 5000
download_size: 210656746
dataset_size: 210775007.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
316usman/thematic2a_rr_embed | 316usman | "2024-04-06T15:46:46Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T15:46:41Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 53291241
num_examples: 84993
download_size: 18284689
dataset_size: 53291241
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Stereotypes-in-LLMs/hiring-analyses-baseline-uk | Stereotypes-in-LLMs | "2024-04-27T08:38:25Z" | 0 | 0 | [
"license:cc-by-4.0",
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T15:46:42Z" | ---
dataset_info:
features:
- name: candidate_id
dtype: string
- name: job_id
dtype: string
- name: CV
dtype: string
- name: Job Description
dtype: string
- name: Job Position
dtype: string
- name: lang
dtype: string
- name: protected_group
dtype: string
- name: protected_attr
dtype: string
- name: group_id
dtype: string
- name: decision
dtype: string
- name: feedback
dtype: string
- name: raw_ai_decision
dtype: string
splits:
- name: gender
num_bytes: 44520606
num_examples: 9000
- name: marital_status
num_bytes: 11187777
num_examples: 2250
- name: military_status
num_bytes: 11170804
num_examples: 2250
- name: religion
num_bytes: 20051123
num_examples: 4050
- name: name
num_bytes: 22263539
num_examples: 4500
- name: age
num_bytes: 13293198
num_examples: 2700
download_size: 9833292
dataset_size: 122487047
configs:
- config_name: default
data_files:
- split: gender
path: data/gender-*
- split: marital_status
path: data/marital_status-*
- split: military_status
path: data/military_status-*
- split: religion
path: data/religion-*
- split: name
path: data/name-*
- split: age
path: data/age-*
license: cc-by-4.0
--- |
316usman/thematic1d_rr | 316usman | "2024-04-06T16:11:48Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T16:08:21Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 82397997.40905319
num_examples: 129936
download_size: 29755891
dataset_size: 82397997.40905319
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
felipesampaio2010/alfredoadame | felipesampaio2010 | "2024-04-06T16:52:53Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-06T16:49:56Z" | ---
license: openrail
---
|
Hack90/experiment_one_viral_genomes_train_set | Hack90 | "2024-04-06T17:06:52Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T17:06:10Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
- name: sequence_quality
dtype: float64
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
splits:
- name: train
num_bytes: 2516022659
num_examples: 350149
download_size: 425683637
dataset_size: 2516022659
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hack90/experiment_one_viral_genomes_val_set | Hack90 | "2024-05-29T21:36:36Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T17:06:52Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: sequence
dtype: string
- name: name
dtype: string
- name: description
dtype: string
- name: features
dtype: int64
- name: seq_length
dtype: int64
- name: sequence_quality
dtype: float64
- name: text
dtype: string
- name: __index_level_0__
dtype: int64
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: 2D_Sequence
sequence:
sequence: float64
- name: 2D_Sequence_Scaled
sequence:
sequence: float64
- name: 2D_Sequence_Interpolated
sequence:
sequence: float64
splits:
- name: train
num_bytes: 4781181377
num_examples: 71306
download_size: 3008235596
dataset_size: 4781181377
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
316usman/thematic2c_rr_embed | 316usman | "2024-04-06T17:08:52Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T17:08:34Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 228710655
num_examples: 358897
download_size: 80867836
dataset_size: 228710655
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mikhail-panzo/processed_dutch | mikhail-panzo | "2024-04-06T17:42:51Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T17:41:24Z" | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
- name: speaker_embeddings
sequence: float32
splits:
- name: train
num_bytes: 2085625854.6812413
num_examples: 10209
- name: test
num_bytes: 231872401.31875882
num_examples: 1135
download_size: 2303025131
dataset_size: 2317498256.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AlexEgito/minhavoz | AlexEgito | "2024-04-06T18:54:12Z" | 0 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-04-06T18:50:35Z" | ---
license: openrail
---
|
ahmedesmail16/DataSet | ahmedesmail16 | "2024-04-06T19:48:01Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:image",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T19:18:11Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Erythromelal
'1': Guttate
'2': Inverse
'3': Nail
'4': Normal
'5': Plaque
'6': Pustular
splits:
- name: train
num_bytes: 11061741.0
num_examples: 327
- name: validation
num_bytes: 2156929.0
num_examples: 68
- name: test
num_bytes: 2684061.0
num_examples: 77
download_size: 15521335
dataset_size: 15902731.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
{'Erythromelal': 0, 'Guttate': 1, 'Inverse': 2, 'Nail': 3, 'Normal': 4, 'Plaque': 5, 'Pustular': 6}
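A brief sketch of recovering this mapping programmatically (assuming the repo id `ahmedesmail16/DataSet` from this card; since `label` is declared as a `class_label` feature above, `datasets` exposes the id-to-name mapping directly):
```
from datasets import load_dataset

ds = load_dataset("ahmedesmail16/DataSet", split="train")

# `label` is a ClassLabel feature, so the integer ids map to the names above
label_feature = ds.features["label"]
print(label_feature.names)               # ['Erythromelal', 'Guttate', ..., 'Pustular']
print(label_feature.int2str(5))          # 'Plaque'
print(label_feature.str2int("Guttate"))  # 1
```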
|
weqweasdas/openchat_model0_data_with_rewards | weqweasdas | "2024-04-06T19:40:12Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T19:40:01Z" | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: type
dtype: string
- name: instances
list:
- name: prompt
dtype: string
- name: responses
sequence: string
- name: rewards
sequence: float64
splits:
- name: train
num_bytes: 164177578
num_examples: 1
download_size: 73760476
dataset_size: 164177578
---
# Dataset Card for "openchat_model0_data_with_rewards"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
iamkaikai/GAME-MAP-ART | iamkaikai | "2024-04-06T20:18:47Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T20:18:37Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 67838476.847
num_examples: 1101
download_size: 67936465
dataset_size: 67838476.847
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
srmisa/elsalvador-context | srmisa | "2024-04-07T12:40:48Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T20:55:52Z" | ---
dataset_info:
features:
- name: id
dtype: int64
- name: text
dtype: string
- name: title
dtype: string
- name: embeddings
sequence: float64
splits:
- name: train
num_bytes: 24800833
num_examples: 3913
download_size: 14690698
dataset_size: 24800833
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_raincandy-u__Rain-7B-v0.2 | open-llm-leaderboard-old | "2024-04-06T21:33:48Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-06T21:33:28Z" | ---
pretty_name: Evaluation run of raincandy-u/Rain-7B-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [raincandy-u/Rain-7B-v0.2](https://huggingface.co/raincandy-u/Rain-7B-v0.2) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_raincandy-u__Rain-7B-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-06T21:31:21.479455](https://huggingface.co/datasets/open-llm-leaderboard/details_raincandy-u__Rain-7B-v0.2/blob/main/results_2024-04-06T21-31-21.479455.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6114222243511679,\n\
\ \"acc_stderr\": 0.03301366749354199,\n \"acc_norm\": 0.6157426651995731,\n\
\ \"acc_norm_stderr\": 0.03367191220334526,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.46437322203907233,\n\
\ \"mc2_stderr\": 0.014942663821308811\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.47952218430034127,\n \"acc_stderr\": 0.014599131353035007,\n\
\ \"acc_norm\": 0.515358361774744,\n \"acc_norm_stderr\": 0.014604496129394904\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.560246962756423,\n\
\ \"acc_stderr\": 0.004953426186069825,\n \"acc_norm\": 0.7511451902011551,\n\
\ \"acc_norm_stderr\": 0.0043146590346494\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6011560693641619,\n\
\ \"acc_stderr\": 0.0373362665538351,\n \"acc_norm\": 0.6011560693641619,\n\
\ \"acc_norm_stderr\": 0.0373362665538351\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6551724137931034,\n \"acc_stderr\": 0.03960933549451208,\n\
\ \"acc_norm\": 0.6551724137931034,\n \"acc_norm_stderr\": 0.03960933549451208\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212378,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212378\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959214,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959214\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.02833560973246336,\n \"acc_norm\"\
: 0.803030303030303,\n \"acc_norm_stderr\": 0.02833560973246336\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548047,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548047\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5820512820512821,\n \"acc_stderr\": 0.025007329882461217,\n\
\ \"acc_norm\": 0.5820512820512821,\n \"acc_norm_stderr\": 0.025007329882461217\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719198,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719198\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.01659525971039929,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.01659525971039929\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.03407632093854053,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.03407632093854053\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695053,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695053\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.046533331469736455,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.046533331469736455\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.68,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.015190473717037498,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.015190473717037498\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.02530525813187971,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.02530525813187971\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.376536312849162,\n\
\ \"acc_stderr\": 0.016204672385106606,\n \"acc_norm\": 0.376536312849162,\n\
\ \"acc_norm_stderr\": 0.016204672385106606\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145894,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145894\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6688102893890675,\n\
\ \"acc_stderr\": 0.02673062072800491,\n \"acc_norm\": 0.6688102893890675,\n\
\ \"acc_norm_stderr\": 0.02673062072800491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.02604176620271716,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.02604176620271716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.41134751773049644,\n \"acc_stderr\": 0.029354911159940978,\n \
\ \"acc_norm\": 0.41134751773049644,\n \"acc_norm_stderr\": 0.029354911159940978\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45045632333767927,\n\
\ \"acc_stderr\": 0.012707390438502346,\n \"acc_norm\": 0.45045632333767927,\n\
\ \"acc_norm_stderr\": 0.012707390438502346\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5808823529411765,\n \"acc_stderr\": 0.029972807170464626,\n\
\ \"acc_norm\": 0.5808823529411765,\n \"acc_norm_stderr\": 0.029972807170464626\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.576797385620915,\n \"acc_stderr\": 0.019987809769482067,\n \
\ \"acc_norm\": 0.576797385620915,\n \"acc_norm_stderr\": 0.019987809769482067\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6363636363636364,\n\
\ \"acc_stderr\": 0.046075820907199756,\n \"acc_norm\": 0.6363636363636364,\n\
\ \"acc_norm_stderr\": 0.046075820907199756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6612244897959184,\n \"acc_stderr\": 0.030299506562154185,\n\
\ \"acc_norm\": 0.6612244897959184,\n \"acc_norm_stderr\": 0.030299506562154185\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.746268656716418,\n\
\ \"acc_stderr\": 0.03076944496729602,\n \"acc_norm\": 0.746268656716418,\n\
\ \"acc_norm_stderr\": 0.03076944496729602\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.46437322203907233,\n\
\ \"mc2_stderr\": 0.014942663821308811\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7071823204419889,\n \"acc_stderr\": 0.012789321118542607\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4874905231235785,\n \
\ \"acc_stderr\": 0.01376817361508787\n }\n}\n```"
repo_url: https://huggingface.co/raincandy-u/Rain-7B-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|arc:challenge|25_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|gsm8k|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hellaswag|10_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T21-31-21.479455.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T21-31-21.479455.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- '**/details_harness|winogrande|5_2024-04-06T21-31-21.479455.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T21-31-21.479455.parquet'
- config_name: results
data_files:
- split: 2024_04_06T21_31_21.479455
path:
- results_2024-04-06T21-31-21.479455.parquet
- split: latest
path:
- results_2024-04-06T21-31-21.479455.parquet
---
# Dataset Card for Evaluation run of raincandy-u/Rain-7B-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [raincandy-u/Rain-7B-v0.2](https://huggingface.co/raincandy-u/Rain-7B-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_raincandy-u__Rain-7B-v0.2",
"harness_winogrande_5",
split="train")
```
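The per-task configs and the aggregated "results" config can be loaded the same way. Below is a minimal sketch, assuming the `datasets` library is installed and using the config and split names listed in this card (the exact column layout of the parquet files is not shown here):
```python
from datasets import load_dataset

# The "latest" split of the "results" config points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_raincandy-u__Rain-7B-v0.2",
                       "results",
                       split="latest")

# Each row holds the aggregated metrics for one run.
print(results[0])
```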
## Latest results
These are the [latest results from run 2024-04-06T21:31:21.479455](https://huggingface.co/datasets/open-llm-leaderboard/details_raincandy-u__Rain-7B-v0.2/blob/main/results_2024-04-06T21-31-21.479455.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6114222243511679,
"acc_stderr": 0.03301366749354199,
"acc_norm": 0.6157426651995731,
"acc_norm_stderr": 0.03367191220334526,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.46437322203907233,
"mc2_stderr": 0.014942663821308811
},
"harness|arc:challenge|25": {
"acc": 0.47952218430034127,
"acc_stderr": 0.014599131353035007,
"acc_norm": 0.515358361774744,
"acc_norm_stderr": 0.014604496129394904
},
"harness|hellaswag|10": {
"acc": 0.560246962756423,
"acc_stderr": 0.004953426186069825,
"acc_norm": 0.7511451902011551,
"acc_norm_stderr": 0.0043146590346494
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6011560693641619,
"acc_stderr": 0.0373362665538351,
"acc_norm": 0.6011560693641619,
"acc_norm_stderr": 0.0373362665538351
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6551724137931034,
"acc_stderr": 0.03960933549451208,
"acc_norm": 0.6551724137931034,
"acc_norm_stderr": 0.03960933549451208
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212378,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212378
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959214,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959214
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.02833560973246336,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.02833560973246336
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548047,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548047
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5820512820512821,
"acc_stderr": 0.025007329882461217,
"acc_norm": 0.5820512820512821,
"acc_norm_stderr": 0.025007329882461217
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719198,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719198
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.01659525971039929,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.01659525971039929
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.03407632093854053,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.03407632093854053
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695053,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695053
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.046533331469736455,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.046533331469736455
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.015190473717037498,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.015190473717037498
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.02530525813187971,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.02530525813187971
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.376536312849162,
"acc_stderr": 0.016204672385106606,
"acc_norm": 0.376536312849162,
"acc_norm_stderr": 0.016204672385106606
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145894,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145894
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6688102893890675,
"acc_stderr": 0.02673062072800491,
"acc_norm": 0.6688102893890675,
"acc_norm_stderr": 0.02673062072800491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.02604176620271716,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.02604176620271716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.41134751773049644,
"acc_stderr": 0.029354911159940978,
"acc_norm": 0.41134751773049644,
"acc_norm_stderr": 0.029354911159940978
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45045632333767927,
"acc_stderr": 0.012707390438502346,
"acc_norm": 0.45045632333767927,
"acc_norm_stderr": 0.012707390438502346
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5808823529411765,
"acc_stderr": 0.029972807170464626,
"acc_norm": 0.5808823529411765,
"acc_norm_stderr": 0.029972807170464626
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.576797385620915,
"acc_stderr": 0.019987809769482067,
"acc_norm": 0.576797385620915,
"acc_norm_stderr": 0.019987809769482067
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6363636363636364,
"acc_stderr": 0.046075820907199756,
"acc_norm": 0.6363636363636364,
"acc_norm_stderr": 0.046075820907199756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6612244897959184,
"acc_stderr": 0.030299506562154185,
"acc_norm": 0.6612244897959184,
"acc_norm_stderr": 0.030299506562154185
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.746268656716418,
"acc_stderr": 0.03076944496729602,
"acc_norm": 0.746268656716418,
"acc_norm_stderr": 0.03076944496729602
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.46437322203907233,
"mc2_stderr": 0.014942663821308811
},
"harness|winogrande|5": {
"acc": 0.7071823204419889,
"acc_stderr": 0.012789321118542607
},
"harness|gsm8k|5": {
"acc": 0.4874905231235785,
"acc_stderr": 0.01376817361508787
}
}
```
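If you prefer the raw JSON file linked above to the parquet splits, it can be fetched directly from the dataset repo. A minimal sketch, assuming the `huggingface_hub` library is available; the filename is the one referenced in the "Latest results" section, and it is an assumption that the metrics may sit under an extra top-level key in the actual file:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the aggregated results JSON referenced above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_raincandy-u__Rain-7B-v0.2",
    filename="results_2024-04-06T21-31-21.479455.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# The metrics shown above may be nested under a "results" key in the raw file.
metrics = data.get("results", data)
print(metrics["all"]["acc"])
```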
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
futureProofGlitch/Lectures-test-V1 | futureProofGlitch | "2024-04-06T21:51:31Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-06T21:50:57Z" | ---
dataset_info:
features:
- name: __index_level_0__
dtype: int64
- name: audio
struct:
- name: array
sequence:
sequence: float32
- name: sampling_rate
dtype: int64
- name: text
dtype: string
splits:
- name: train
num_bytes: 615250132
num_examples: 1881
- name: test
num_bytes: 191693480
num_examples: 627
download_size: 784670622
dataset_size: 806943612
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
316usman/thematic2d_rr_embed | 316usman | "2024-04-06T22:18:29Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T22:18:25Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
splits:
- name: train
num_bytes: 34665058
num_examples: 55374
download_size: 12625120
dataset_size: 34665058
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gsh3729/sw1 | gsh3729 | "2024-04-06T22:21:33Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T22:20:22Z" | ---
dataset_info:
features:
- name: filename
dtype: string
- name: tif
dtype: binary
- name: tfw
dtype: binary
splits:
- name: train
num_bytes: 25698
num_examples: 2
download_size: 27436
dataset_size: 25698
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_lex-hue__Delexa-7b | open-llm-leaderboard-old | "2024-04-07T00:26:18Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-06T22:29:56Z" | ---
pretty_name: Evaluation run of lex-hue/Delexa-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lex-hue/Delexa-7b](https://huggingface.co/lex-hue/Delexa-7b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lex-hue__Delexa-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T00:23:38.543914](https://huggingface.co/datasets/open-llm-leaderboard/details_lex-hue__Delexa-7b/blob/main/results_2024-04-07T00-23-38.543914.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6496468924225586,\n\
\ \"acc_stderr\": 0.03207144308728149,\n \"acc_norm\": 0.6509773836165138,\n\
\ \"acc_norm_stderr\": 0.0327145643645653,\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6217742992950922,\n\
\ \"mc2_stderr\": 0.015455929661783052\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6467576791808873,\n \"acc_stderr\": 0.013967822714840055,\n\
\ \"acc_norm\": 0.6825938566552902,\n \"acc_norm_stderr\": 0.013602239088038169\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6879107747460665,\n\
\ \"acc_stderr\": 0.004623990785158488,\n \"acc_norm\": 0.8650667197769368,\n\
\ \"acc_norm_stderr\": 0.003409540533249841\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569526,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569526\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.035868792800803406,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.035868792800803406\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.03599586301247077,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.03599586301247077\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5872340425531914,\n\
\ \"acc_stderr\": 0.03218471141400351,\n \"acc_norm\": 0.5872340425531914,\n\
\ \"acc_norm_stderr\": 0.03218471141400351\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.4824561403508772,\n \"acc_stderr\": 0.04700708033551038,\n\
\ \"acc_norm\": 0.4824561403508772,\n \"acc_norm_stderr\": 0.04700708033551038\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n \"\
acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5079365079365079,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.5079365079365079,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8,\n\
\ \"acc_stderr\": 0.02275520495954294,\n \"acc_norm\": 0.8,\n \
\ \"acc_norm_stderr\": 0.02275520495954294\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.02371088850197057,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.02371088850197057\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6974789915966386,\n \"acc_stderr\": 0.029837962388291946,\n\
\ \"acc_norm\": 0.6974789915966386,\n \"acc_norm_stderr\": 0.029837962388291946\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931796,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931796\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508766,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508766\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.04738975119274155,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.04738975119274155\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993452,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993452\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.02370309952525817,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.02370309952525817\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.016553287863116037,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.016553287863116037\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.02573885479781873,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.02573885479781873\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4645390070921986,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.4645390070921986,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4791395045632334,\n\
\ \"acc_stderr\": 0.012759117066518019,\n \"acc_norm\": 0.4791395045632334,\n\
\ \"acc_norm_stderr\": 0.012759117066518019\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7169117647058824,\n \"acc_stderr\": 0.027365861131513812,\n\
\ \"acc_norm\": 0.7169117647058824,\n \"acc_norm_stderr\": 0.027365861131513812\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6388888888888888,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.02954774168764004,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.02954774168764004\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4418604651162791,\n\
\ \"mc1_stderr\": 0.017384767478986218,\n \"mc2\": 0.6217742992950922,\n\
\ \"mc2_stderr\": 0.015455929661783052\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7916337805840569,\n \"acc_stderr\": 0.011414554399987729\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6421531463229719,\n \
\ \"acc_stderr\": 0.013204142536119947\n }\n}\n```"
repo_url: https://huggingface.co/lex-hue/Delexa-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|arc:challenge|25_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|arc:challenge|25_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|gsm8k|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|gsm8k|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hellaswag|10_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hellaswag|10_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T22-27-37.762517.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T00-23-38.543914.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T00-23-38.543914.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- '**/details_harness|winogrande|5_2024-04-06T22-27-37.762517.parquet'
- split: 2024_04_07T00_23_38.543914
path:
- '**/details_harness|winogrande|5_2024-04-07T00-23-38.543914.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T00-23-38.543914.parquet'
- config_name: results
data_files:
- split: 2024_04_06T22_27_37.762517
path:
- results_2024-04-06T22-27-37.762517.parquet
- split: 2024_04_07T00_23_38.543914
path:
- results_2024-04-07T00-23-38.543914.parquet
- split: latest
path:
- results_2024-04-07T00-23-38.543914.parquet
---
# Dataset Card for Evaluation run of lex-hue/Delexa-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lex-hue/Delexa-7b](https://huggingface.co/lex-hue/Delexa-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lex-hue__Delexa-7b",
"harness_winogrande_5",
split="train")
```
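Each per-task configuration can also be loaded on its own. The sketch below is only illustrative: the config and split names are taken directly from the listing above, and the variable names are placeholders.
```python
from datasets import load_dataset

# Load the most recent run for a single MMLU subject; the "latest" split
# always points to the newest timestamped split listed in the configs above.
abstract_algebra = load_dataset(
    "open-llm-leaderboard/details_lex-hue__Delexa-7b",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

# A specific run can be selected with its timestamped split name instead.
first_run = load_dataset(
    "open-llm-leaderboard/details_lex-hue__Delexa-7b",
    "harness_hendrycksTest_abstract_algebra_5",
    split="2024_04_06T22_27_37.762517",
)
```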
## Latest results
These are the [latest results from run 2024-04-07T00:23:38.543914](https://huggingface.co/datasets/open-llm-leaderboard/details_lex-hue__Delexa-7b/blob/main/results_2024-04-07T00-23-38.543914.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6496468924225586,
"acc_stderr": 0.03207144308728149,
"acc_norm": 0.6509773836165138,
"acc_norm_stderr": 0.0327145643645653,
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6217742992950922,
"mc2_stderr": 0.015455929661783052
},
"harness|arc:challenge|25": {
"acc": 0.6467576791808873,
"acc_stderr": 0.013967822714840055,
"acc_norm": 0.6825938566552902,
"acc_norm_stderr": 0.013602239088038169
},
"harness|hellaswag|10": {
"acc": 0.6879107747460665,
"acc_stderr": 0.004623990785158488,
"acc_norm": 0.8650667197769368,
"acc_norm_stderr": 0.003409540533249841
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569526,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569526
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.035868792800803406,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.035868792800803406
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5079365079365079,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.5079365079365079,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8,
"acc_stderr": 0.02275520495954294,
"acc_norm": 0.8,
"acc_norm_stderr": 0.02275520495954294
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.02371088850197057,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.02371088850197057
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6974789915966386,
"acc_stderr": 0.029837962388291946,
"acc_norm": 0.6974789915966386,
"acc_norm_stderr": 0.029837962388291946
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931796,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931796
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508766,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508766
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.04738975119274155,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.04738975119274155
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993452,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993452
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.02370309952525817,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.02370309952525817
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.016553287863116037,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.016553287863116037
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.02573885479781873,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.02573885479781873
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4645390070921986,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.4645390070921986,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4791395045632334,
"acc_stderr": 0.012759117066518019,
"acc_norm": 0.4791395045632334,
"acc_norm_stderr": 0.012759117066518019
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7169117647058824,
"acc_stderr": 0.027365861131513812,
"acc_norm": 0.7169117647058824,
"acc_norm_stderr": 0.027365861131513812
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.02954774168764004,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.02954774168764004
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4418604651162791,
"mc1_stderr": 0.017384767478986218,
"mc2": 0.6217742992950922,
"mc2_stderr": 0.015455929661783052
},
"harness|winogrande|5": {
"acc": 0.7916337805840569,
"acc_stderr": 0.011414554399987729
},
"harness|gsm8k|5": {
"acc": 0.6421531463229719,
"acc_stderr": 0.013204142536119947
}
}
```
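The same aggregated numbers can also be retrieved programmatically from the "results" configuration. This is a minimal sketch, assuming the `results` config keeps the layout shown in the listing above:
```python
from datasets import load_dataset

# The "results" config aggregates all task metrics for each run; its
# "latest" split points at results_2024-04-07T00-23-38.543914.parquet.
results = load_dataset(
    "open-llm-leaderboard/details_lex-hue__Delexa-7b",
    "results",
    split="latest",
)

# Inspect the stored columns, e.g. to pick out the ARC or HellaSwag accuracies.
print(results.column_names)
```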
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
JINIAC/ja_law_20240330_prefilter | JINIAC | "2024-04-06T23:08:23Z" | 0 | 0 | [
"license:cc-by-4.0",
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T22:46:54Z" | ---
license: cc-by-4.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1501734982
num_examples: 449041
download_size: 422291061
dataset_size: 1501734982
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JINIAC/aozorabunko_prefilter | JINIAC | "2024-04-06T23:09:11Z" | 0 | 0 | [
"license:cc-by-4.0",
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T22:47:33Z" | ---
license: cc-by-4.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 523975850
num_examples: 2194409
download_size: 295216795
dataset_size: 523975850
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
JINIAC/ja_wiki_20240301_prefilter | JINIAC | "2024-04-06T23:06:14Z" | 0 | 0 | [
"license:cc-by-sa-4.0",
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T22:48:11Z" | ---
license: cc-by-sa-4.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 673727719
num_examples: 248501
download_size: 411872163
dataset_size: 673727719
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
David19930/audio_dataset_wsp | David19930 | "2024-04-06T23:05:14Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:audio",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T22:52:36Z" | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: transcribe
dtype: string
splits:
- name: train
num_bytes: 1638963.0
num_examples: 87
download_size: 1634215
dataset_size: 1638963.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_mtgv__MobileLLaMA-1.4B-Base | open-llm-leaderboard-old | "2024-04-06T23:24:53Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-06T23:24:12Z" | ---
pretty_name: Evaluation run of mtgv/MobileLLaMA-1.4B-Base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mtgv/MobileLLaMA-1.4B-Base](https://huggingface.co/mtgv/MobileLLaMA-1.4B-Base)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mtgv__MobileLLaMA-1.4B-Base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-06T23:22:22.302402](https://huggingface.co/datasets/open-llm-leaderboard/details_mtgv__MobileLLaMA-1.4B-Base/blob/main/results_2024-04-06T23-22-22.302402.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2537476149311213,\n\
\ \"acc_stderr\": 0.03071237430286461,\n \"acc_norm\": 0.2549089996364952,\n\
\ \"acc_norm_stderr\": 0.03147631325439452,\n \"mc1\": 0.21664626682986537,\n\
\ \"mc1_stderr\": 0.014421468452506985,\n \"mc2\": 0.3481107244797803,\n\
\ \"mc2_stderr\": 0.013684586182211824\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.310580204778157,\n \"acc_stderr\": 0.013522292098053054,\n\
\ \"acc_norm\": 0.3438566552901024,\n \"acc_norm_stderr\": 0.013880644570156215\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42949611631149176,\n\
\ \"acc_stderr\": 0.004939925958728871,\n \"acc_norm\": 0.5629356701852221,\n\
\ \"acc_norm_stderr\": 0.004950095555964667\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.26666666666666666,\n\
\ \"acc_stderr\": 0.03820169914517905,\n \"acc_norm\": 0.26666666666666666,\n\
\ \"acc_norm_stderr\": 0.03820169914517905\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.22264150943396227,\n \"acc_stderr\": 0.025604233470899098,\n\
\ \"acc_norm\": 0.22264150943396227,\n \"acc_norm_stderr\": 0.025604233470899098\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.16,\n \"acc_stderr\": 0.0368452949177471,\n \"acc_norm\"\
: 0.16,\n \"acc_norm_stderr\": 0.0368452949177471\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.044084400227680794,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.044084400227680794\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749888,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749888\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\": 0.23,\n\
\ \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.31063829787234043,\n \"acc_stderr\": 0.03025123757921317,\n\
\ \"acc_norm\": 0.31063829787234043,\n \"acc_norm_stderr\": 0.03025123757921317\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.23684210526315788,\n\
\ \"acc_stderr\": 0.03999423879281336,\n \"acc_norm\": 0.23684210526315788,\n\
\ \"acc_norm_stderr\": 0.03999423879281336\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131184,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131184\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2857142857142857,\n \"acc_stderr\": 0.02326651221373057,\n \"\
acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02326651221373057\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.04073524322147125,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.04073524322147125\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.26129032258064516,\n \"acc_stderr\": 0.024993053397764815,\n \"\
acc_norm\": 0.26129032258064516,\n \"acc_norm_stderr\": 0.024993053397764815\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.24630541871921183,\n \"acc_stderr\": 0.03031509928561773,\n \"\
acc_norm\": 0.24630541871921183,\n \"acc_norm_stderr\": 0.03031509928561773\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23636363636363636,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.23636363636363636,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20725388601036268,\n \"acc_stderr\": 0.029252823291803613,\n\
\ \"acc_norm\": 0.20725388601036268,\n \"acc_norm_stderr\": 0.029252823291803613\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.020660597485026924,\n\
\ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.020660597485026924\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.22962962962962963,\n \"acc_stderr\": 0.025644108639267624,\n \
\ \"acc_norm\": 0.22962962962962963,\n \"acc_norm_stderr\": 0.025644108639267624\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19747899159663865,\n \"acc_stderr\": 0.02585916412205146,\n\
\ \"acc_norm\": 0.19747899159663865,\n \"acc_norm_stderr\": 0.02585916412205146\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804723,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804723\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.01792308766780305,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.01792308766780305\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046948,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046948\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604257,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604257\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34080717488789236,\n\
\ \"acc_stderr\": 0.031811497470553604,\n \"acc_norm\": 0.34080717488789236,\n\
\ \"acc_norm_stderr\": 0.031811497470553604\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2644628099173554,\n \"acc_stderr\": 0.04026187527591207,\n \"\
acc_norm\": 0.2644628099173554,\n \"acc_norm_stderr\": 0.04026187527591207\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2524271844660194,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.2524271844660194,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.02934311479809447,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.02934311479809447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2796934865900383,\n\
\ \"acc_stderr\": 0.016050792148036553,\n \"acc_norm\": 0.2796934865900383,\n\
\ \"acc_norm_stderr\": 0.016050792148036553\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.26011560693641617,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.26011560693641617,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2424581005586592,\n\
\ \"acc_stderr\": 0.014333522059217889,\n \"acc_norm\": 0.2424581005586592,\n\
\ \"acc_norm_stderr\": 0.014333522059217889\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20915032679738563,\n \"acc_stderr\": 0.023287685312334806,\n\
\ \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.023287685312334806\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24437299035369775,\n\
\ \"acc_stderr\": 0.024406162094668886,\n \"acc_norm\": 0.24437299035369775,\n\
\ \"acc_norm_stderr\": 0.024406162094668886\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.02409347123262133,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.02409347123262133\n \
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\"\
: 0.2553191489361702,\n \"acc_stderr\": 0.026011992930901996,\n \"\
acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930901996\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2405475880052151,\n\
\ \"acc_stderr\": 0.010916406735478949,\n \"acc_norm\": 0.2405475880052151,\n\
\ \"acc_norm_stderr\": 0.010916406735478949\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.37272727272727274,\n \"acc_stderr\": 0.04631381319425463,\n\
\ \"acc_norm\": 0.37272727272727274,\n \"acc_norm_stderr\": 0.04631381319425463\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.1836734693877551,\n\
\ \"acc_stderr\": 0.02478907133200765,\n \"acc_norm\": 0.1836734693877551,\n\
\ \"acc_norm_stderr\": 0.02478907133200765\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.21393034825870647,\n \"acc_stderr\": 0.028996909693328906,\n\
\ \"acc_norm\": 0.21393034825870647,\n \"acc_norm_stderr\": 0.028996909693328906\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370519,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370519\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.29239766081871343,\n\
\ \"acc_stderr\": 0.03488647713457922,\n \"acc_norm\": 0.29239766081871343,\n\
\ \"acc_norm_stderr\": 0.03488647713457922\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.21664626682986537,\n \"mc1_stderr\": 0.014421468452506985,\n\
\ \"mc2\": 0.3481107244797803,\n \"mc2_stderr\": 0.013684586182211824\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.5943172849250198,\n\
\ \"acc_stderr\": 0.013800206336014207\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.011372251705837756,\n \"acc_stderr\": 0.0029206661987887465\n\
\ }\n}\n```"
repo_url: https://huggingface.co/mtgv/MobileLLaMA-1.4B-Base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|arc:challenge|25_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|gsm8k|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hellaswag|10_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T23-22-22.302402.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-06T23-22-22.302402.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- '**/details_harness|winogrande|5_2024-04-06T23-22-22.302402.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-06T23-22-22.302402.parquet'
- config_name: results
data_files:
- split: 2024_04_06T23_22_22.302402
path:
- results_2024-04-06T23-22-22.302402.parquet
- split: latest
path:
- results_2024-04-06T23-22-22.302402.parquet
---
# Dataset Card for Evaluation run of mtgv/MobileLLaMA-1.4B-Base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mtgv/MobileLLaMA-1.4B-Base](https://huggingface.co/mtgv/MobileLLaMA-1.4B-Base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mtgv__MobileLLaMA-1.4B-Base",
"harness_winogrande_5",
split="train")
```
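The aggregated metrics shown in the next section are also exposed through the `results` configuration listed in the YAML header. A minimal sketch of loading them (the config and split names come from that header; inspect the schema before relying on particular columns):
```python
from datasets import load_dataset

# Load the aggregated results for the latest run of this model.
# "results" and "latest" are the config/split names declared in the YAML header above.
results = load_dataset("open-llm-leaderboard/details_mtgv__MobileLLaMA-1.4B-Base",
                       "results",
                       split="latest")
print(results)  # inspect the available columns before assuming a schema
```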
## Latest results
These are the [latest results from run 2024-04-06T23:22:22.302402](https://huggingface.co/datasets/open-llm-leaderboard/details_mtgv__MobileLLaMA-1.4B-Base/blob/main/results_2024-04-06T23-22-22.302402.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2537476149311213,
"acc_stderr": 0.03071237430286461,
"acc_norm": 0.2549089996364952,
"acc_norm_stderr": 0.03147631325439452,
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506985,
"mc2": 0.3481107244797803,
"mc2_stderr": 0.013684586182211824
},
"harness|arc:challenge|25": {
"acc": 0.310580204778157,
"acc_stderr": 0.013522292098053054,
"acc_norm": 0.3438566552901024,
"acc_norm_stderr": 0.013880644570156215
},
"harness|hellaswag|10": {
"acc": 0.42949611631149176,
"acc_stderr": 0.004939925958728871,
"acc_norm": 0.5629356701852221,
"acc_norm_stderr": 0.004950095555964667
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.03820169914517905,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.03820169914517905
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.22264150943396227,
"acc_stderr": 0.025604233470899098,
"acc_norm": 0.22264150943396227,
"acc_norm_stderr": 0.025604233470899098
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.16,
"acc_stderr": 0.0368452949177471,
"acc_norm": 0.16,
"acc_norm_stderr": 0.0368452949177471
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749888,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749888
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.31063829787234043,
"acc_stderr": 0.03025123757921317,
"acc_norm": 0.31063829787234043,
"acc_norm_stderr": 0.03025123757921317
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.23684210526315788,
"acc_stderr": 0.03999423879281336,
"acc_norm": 0.23684210526315788,
"acc_norm_stderr": 0.03999423879281336
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131184,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131184
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02326651221373057,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02326651221373057
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.04073524322147125,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.04073524322147125
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.26129032258064516,
"acc_stderr": 0.024993053397764815,
"acc_norm": 0.26129032258064516,
"acc_norm_stderr": 0.024993053397764815
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.24630541871921183,
"acc_stderr": 0.03031509928561773,
"acc_norm": 0.24630541871921183,
"acc_norm_stderr": 0.03031509928561773
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23636363636363636,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.23636363636363636,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20725388601036268,
"acc_stderr": 0.029252823291803613,
"acc_norm": 0.20725388601036268,
"acc_norm_stderr": 0.029252823291803613
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.020660597485026924,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.020660597485026924
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.025644108639267624,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.025644108639267624
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19747899159663865,
"acc_stderr": 0.02585916412205146,
"acc_norm": 0.19747899159663865,
"acc_norm_stderr": 0.02585916412205146
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804723,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804723
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.01792308766780305,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.01792308766780305
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046948,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046948
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604257,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604257
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34080717488789236,
"acc_stderr": 0.031811497470553604,
"acc_norm": 0.34080717488789236,
"acc_norm_stderr": 0.031811497470553604
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2644628099173554,
"acc_stderr": 0.04026187527591207,
"acc_norm": 0.2644628099173554,
"acc_norm_stderr": 0.04026187527591207
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.2524271844660194,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.2524271844660194,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02934311479809447,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02934311479809447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2796934865900383,
"acc_stderr": 0.016050792148036553,
"acc_norm": 0.2796934865900383,
"acc_norm_stderr": 0.016050792148036553
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.26011560693641617,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.26011560693641617,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2424581005586592,
"acc_stderr": 0.014333522059217889,
"acc_norm": 0.2424581005586592,
"acc_norm_stderr": 0.014333522059217889
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.023287685312334806,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.023287685312334806
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24437299035369775,
"acc_stderr": 0.024406162094668886,
"acc_norm": 0.24437299035369775,
"acc_norm_stderr": 0.024406162094668886
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25,
"acc_stderr": 0.02409347123262133,
"acc_norm": 0.25,
"acc_norm_stderr": 0.02409347123262133
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930901996,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930901996
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2405475880052151,
"acc_stderr": 0.010916406735478949,
"acc_norm": 0.2405475880052151,
"acc_norm_stderr": 0.010916406735478949
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.37272727272727274,
"acc_stderr": 0.04631381319425463,
"acc_norm": 0.37272727272727274,
"acc_norm_stderr": 0.04631381319425463
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1836734693877551,
"acc_stderr": 0.02478907133200765,
"acc_norm": 0.1836734693877551,
"acc_norm_stderr": 0.02478907133200765
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21393034825870647,
"acc_stderr": 0.028996909693328906,
"acc_norm": 0.21393034825870647,
"acc_norm_stderr": 0.028996909693328906
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.03488647713457922,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.03488647713457922
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21664626682986537,
"mc1_stderr": 0.014421468452506985,
"mc2": 0.3481107244797803,
"mc2_stderr": 0.013684586182211824
},
"harness|winogrande|5": {
"acc": 0.5943172849250198,
"acc_stderr": 0.013800206336014207
},
"harness|gsm8k|5": {
"acc": 0.011372251705837756,
"acc_stderr": 0.0029206661987887465
}
}
```
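Since each per-task entry carries its own standard error, one way to read these numbers is to check which MMLU subtasks clear the 25% random-guess baseline for four-choice questions. A small sketch, assuming the JSON above has been saved locally as `results.json` (a hypothetical filename):
```python
import json

with open("results.json") as f:  # hypothetical local copy of the JSON above
    results = json.load(f)

# Flag MMLU subtasks whose accuracy exceeds chance (0.25) by two standard errors.
for task, metrics in results.items():
    if task.startswith("harness|hendrycksTest") and "acc" in metrics:
        above = metrics["acc"] - 2 * metrics["acc_stderr"] > 0.25
        print(f"{task}: acc={metrics['acc']:.3f}, clearly above chance: {above}")
```
Most subtasks here sit near the 0.25 baseline, which matches the aggregate `acc` of about 0.254 reported under `"all"`.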
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
rlinares2/chatbot_arena_embeddings_adav2 | rlinares2 | "2024-04-06T23:51:20Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T23:41:55Z" | ---
dataset_info:
features:
- name: question_embedding
sequence: float64
- name: answer_embeddings_a
sequence: float64
- name: answer_embeddings_b
sequence: float64
splits:
- name: train
num_bytes: 36876000
num_examples: 1000
download_size: 27503489
dataset_size: 36876000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
rlinares2/chatbot_arena_embeddings_adav3 | rlinares2 | "2024-04-07T17:26:25Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-06T23:57:45Z" | ---
dataset_info:
features:
- name: question_embedding
sequence: float64
- name: answer_embeddings_a
sequence: float64
- name: answer_embeddings_b
sequence: float64
splits:
- name: train
num_bytes: 885024000
num_examples: 24000
download_size: 647518122
dataset_size: 885024000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
williamroque/narradorinsta | williamroque | "2024-04-07T00:31:03Z" | 0 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-04-07T00:31:03Z" | ---
license: openrail
---
|
diwank/llmlingua-compressed-text | diwank | "2024-04-08T01:02:47Z" | 0 | 2 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T01:54:44Z" | ---
dataset_info:
features:
- name: token_counts
sequence: int64
- name: original
dtype: string
- name: compressed
dtype: string
splits:
- name: train
num_bytes: 103018912
num_examples: 150908
- name: test
num_bytes: 49074430
num_examples: 71440
download_size: 92752725
dataset_size: 152093342
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard-old/details_Kukedlc__NeuralSynthesis-7B-v0.3 | open-llm-leaderboard-old | "2024-04-07T01:55:42Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T01:55:20Z" | ---
pretty_name: Evaluation run of Kukedlc/NeuralSynthesis-7B-v0.3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kukedlc/NeuralSynthesis-7B-v0.3](https://huggingface.co/Kukedlc/NeuralSynthesis-7B-v0.3)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7B-v0.3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T01:52:59.925688](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7B-v0.3/blob/main/results_2024-04-07T01-52-59.925688.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6504587057421836,\n\
\ \"acc_stderr\": 0.032086695538193204,\n \"acc_norm\": 0.6493227360939098,\n\
\ \"acc_norm_stderr\": 0.03276436295683278,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7812784766032873,\n\
\ \"mc2_stderr\": 0.013665362821405602\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n\
\ \"acc_norm\": 0.726962457337884,\n \"acc_norm_stderr\": 0.013019332762635753\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7177853017327226,\n\
\ \"acc_stderr\": 0.0044915745394418834,\n \"acc_norm\": 0.8917546305516829,\n\
\ \"acc_norm_stderr\": 0.0031005509089161993\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926924,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926924\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4335195530726257,\n\
\ \"acc_stderr\": 0.01657402721951763,\n \"acc_norm\": 0.4335195530726257,\n\
\ \"acc_norm_stderr\": 0.01657402721951763\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967284,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967284\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781752,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069446,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069446\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7812784766032873,\n\
\ \"mc2_stderr\": 0.013665362821405602\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7088703563305534,\n \
\ \"acc_stderr\": 0.012513215297888463\n }\n}\n```"
repo_url: https://huggingface.co/Kukedlc/NeuralSynthesis-7B-v0.3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|arc:challenge|25_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|gsm8k|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hellaswag|10_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T01-52-59.925688.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T01-52-59.925688.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- '**/details_harness|winogrande|5_2024-04-07T01-52-59.925688.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T01-52-59.925688.parquet'
- config_name: results
data_files:
- split: 2024_04_07T01_52_59.925688
path:
- results_2024-04-07T01-52-59.925688.parquet
- split: latest
path:
- results_2024-04-07T01-52-59.925688.parquet
---
# Dataset Card for Evaluation run of Kukedlc/NeuralSynthesis-7B-v0.3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kukedlc/NeuralSynthesis-7B-v0.3](https://huggingface.co/Kukedlc/NeuralSynthesis-7B-v0.3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7B-v0.3",
"harness_winogrande_5",
split="train")
```
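The same pattern works for any configuration and split listed in this card. As a minimal sketch (the config and split names below are taken from the configs section above; only the `datasets` library is assumed to be installed):
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7B-v0.3"

# Aggregated metrics for the run: the "results" config; the "latest"
# split always points to the most recent evaluation.
results = load_dataset(REPO, "results", split="latest")

# A single task from a specific run, addressed by its timestamped split name.
winogrande_run = load_dataset(
    REPO,
    "harness_winogrande_5",
    split="2024_04_07T01_52_59.925688",
)
```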
## Latest results
These are the [latest results from run 2024-04-07T01:52:59.925688](https://huggingface.co/datasets/open-llm-leaderboard/details_Kukedlc__NeuralSynthesis-7B-v0.3/blob/main/results_2024-04-07T01-52-59.925688.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6504587057421836,
"acc_stderr": 0.032086695538193204,
"acc_norm": 0.6493227360939098,
"acc_norm_stderr": 0.03276436295683278,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7812784766032873,
"mc2_stderr": 0.013665362821405602
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.726962457337884,
"acc_norm_stderr": 0.013019332762635753
},
"harness|hellaswag|10": {
"acc": 0.7177853017327226,
"acc_stderr": 0.0044915745394418834,
"acc_norm": 0.8917546305516829,
"acc_norm_stderr": 0.0031005509089161993
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926924,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926924
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4335195530726257,
"acc_stderr": 0.01657402721951763,
"acc_norm": 0.4335195530726257,
"acc_norm_stderr": 0.01657402721951763
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967284,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967284
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781752,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069446,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069446
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7812784766032873,
"mc2_stderr": 0.013665362821405602
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.7088703563305534,
"acc_stderr": 0.012513215297888463
}
}
```
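As one illustration of consuming these metrics, the sketch below averages the per-subject MMLU (`hendrycksTest`) accuracies; it assumes the dictionary printed above has been saved verbatim to a local `results.json` (a hypothetical filename used only for this example):
```python
import json

# Hypothetical local copy of the metrics dictionary shown above.
with open("results.json") as f:
    metrics = json.load(f)

# Collect the accuracy of every MMLU ("hendrycksTest") sub-task.
mmlu_accs = [
    task["acc"]
    for name, task in metrics.items()
    if name.startswith("harness|hendrycksTest-")
]

print(f"Mean MMLU accuracy over {len(mmlu_accs)} subjects: "
      f"{sum(mmlu_accs) / len(mmlu_accs):.4f}")
```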
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_Isotonic__Mixnueza-Chat-6x32M-MoE | open-llm-leaderboard-old | "2024-04-07T02:12:33Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T02:11:48Z" | ---
pretty_name: Evaluation run of Isotonic/Mixnueza-Chat-6x32M-MoE
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Isotonic/Mixnueza-Chat-6x32M-MoE](https://huggingface.co/Isotonic/Mixnueza-Chat-6x32M-MoE)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T02:09:55.470077](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE/blob/main/results_2024-04-07T02-09-55.470077.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25475467189344064,\n\
\ \"acc_stderr\": 0.030639762090785793,\n \"acc_norm\": 0.2552581810114775,\n\
\ \"acc_norm_stderr\": 0.03144935013039305,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.015298077509485083,\n \"mc2\": 0.4727026528122458,\n\
\ \"mc2_stderr\": 0.015699277111857743\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.18088737201365188,\n \"acc_stderr\": 0.011248574467407034,\n\
\ \"acc_norm\": 0.20392491467576793,\n \"acc_norm_stderr\": 0.011774262478702256\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2629954192391954,\n\
\ \"acc_stderr\": 0.004393601887506585,\n \"acc_norm\": 0.26528579964150567,\n\
\ \"acc_norm_stderr\": 0.004405829993258718\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768081,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768081\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.03633384414073461,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.03633384414073461\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210325,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210325\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.2,\n\
\ \"acc_stderr\": 0.040201512610368445,\n \"acc_norm\": 0.2,\n \
\ \"acc_norm_stderr\": 0.040201512610368445\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21509433962264152,\n \"acc_stderr\": 0.025288394502891356,\n\
\ \"acc_norm\": 0.21509433962264152,\n \"acc_norm_stderr\": 0.025288394502891356\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847415,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847415\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198816,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198816\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237654,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237654\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.251063829787234,\n \"acc_stderr\": 0.028346963777162452,\n\
\ \"acc_norm\": 0.251063829787234,\n \"acc_norm_stderr\": 0.028346963777162452\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.19298245614035087,\n\
\ \"acc_stderr\": 0.037124548537213684,\n \"acc_norm\": 0.19298245614035087,\n\
\ \"acc_norm_stderr\": 0.037124548537213684\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21212121212121213,\n \"acc_stderr\": 0.03192271569548299,\n\
\ \"acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.03192271569548299\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27461139896373055,\n \"acc_stderr\": 0.03221024508041154,\n\
\ \"acc_norm\": 0.27461139896373055,\n \"acc_norm_stderr\": 0.03221024508041154\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3564102564102564,\n \"acc_stderr\": 0.024283140529467295,\n\
\ \"acc_norm\": 0.3564102564102564,\n \"acc_norm_stderr\": 0.024283140529467295\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145668,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145668\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3025210084033613,\n \"acc_stderr\": 0.02983796238829193,\n \
\ \"acc_norm\": 0.3025210084033613,\n \"acc_norm_stderr\": 0.02983796238829193\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22752293577981653,\n \"acc_stderr\": 0.017974463578776502,\n \"\
acc_norm\": 0.22752293577981653,\n \"acc_norm_stderr\": 0.017974463578776502\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.22058823529411764,\n\
\ \"acc_stderr\": 0.02910225438967409,\n \"acc_norm\": 0.22058823529411764,\n\
\ \"acc_norm_stderr\": 0.02910225438967409\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.27802690582959644,\n\
\ \"acc_stderr\": 0.030069584874494047,\n \"acc_norm\": 0.27802690582959644,\n\
\ \"acc_norm_stderr\": 0.030069584874494047\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2727272727272727,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.18518518518518517,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.18518518518518517,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3128834355828221,\n \"acc_stderr\": 0.036429145782924055,\n\
\ \"acc_norm\": 0.3128834355828221,\n \"acc_norm_stderr\": 0.036429145782924055\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.19642857142857142,\n\
\ \"acc_stderr\": 0.03770970049347018,\n \"acc_norm\": 0.19642857142857142,\n\
\ \"acc_norm_stderr\": 0.03770970049347018\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2264957264957265,\n\
\ \"acc_stderr\": 0.027421007295392926,\n \"acc_norm\": 0.2264957264957265,\n\
\ \"acc_norm_stderr\": 0.027421007295392926\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2848020434227331,\n\
\ \"acc_stderr\": 0.01613917409652258,\n \"acc_norm\": 0.2848020434227331,\n\
\ \"acc_norm_stderr\": 0.01613917409652258\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.22832369942196531,\n \"acc_stderr\": 0.022598703804321635,\n\
\ \"acc_norm\": 0.22832369942196531,\n \"acc_norm_stderr\": 0.022598703804321635\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.28938906752411575,\n\
\ \"acc_stderr\": 0.025755865922632924,\n \"acc_norm\": 0.28938906752411575,\n\
\ \"acc_norm_stderr\": 0.025755865922632924\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005723,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005723\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843007,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843007\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23728813559322035,\n\
\ \"acc_stderr\": 0.010865436690780264,\n \"acc_norm\": 0.23728813559322035,\n\
\ \"acc_norm_stderr\": 0.010865436690780264\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.016684820929148587,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.016684820929148587\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.038950910157241364,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.038950910157241364\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.23265306122448978,\n \"acc_stderr\": 0.02704925791589618,\n\
\ \"acc_norm\": 0.23265306122448978,\n \"acc_norm_stderr\": 0.02704925791589618\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.036643147772880864,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.036643147772880864\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.015298077509485083,\n \"mc2\": 0.4727026528122458,\n\
\ \"mc2_stderr\": 0.015699277111857743\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.505130228887135,\n \"acc_stderr\": 0.01405174596179052\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Isotonic/Mixnueza-Chat-6x32M-MoE
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|arc:challenge|25_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|gsm8k|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hellaswag|10_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-09-55.470077.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T02-09-55.470077.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- '**/details_harness|winogrande|5_2024-04-07T02-09-55.470077.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T02-09-55.470077.parquet'
- config_name: results
data_files:
- split: 2024_04_07T02_09_55.470077
path:
- results_2024-04-07T02-09-55.470077.parquet
- split: latest
path:
- results_2024-04-07T02-09-55.470077.parquet
---
# Dataset Card for Evaluation run of Isotonic/Mixnueza-Chat-6x32M-MoE
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Isotonic/Mixnueza-Chat-6x32M-MoE](https://huggingface.co/Isotonic/Mixnueza-Chat-6x32M-MoE) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE",
"harness_winogrande_5",
split="train")
```
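If you only need the aggregated metrics rather than the per-task details, the `results` config listed in the YAML header above can be loaded the same way. This is a minimal sketch, assuming the `latest` split defined for that config:
```python
from datasets import load_dataset

# Minimal sketch: load the aggregated metrics for the most recent run.
# The "results" config and its "latest" split are declared in the configs above.
results = load_dataset(
    "open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE",
    "results",
    split="latest",
)
print(results[0])  # a single row holding the aggregated scores for the run
```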
## Latest results
These are the [latest results from run 2024-04-07T02:09:55.470077](https://huggingface.co/datasets/open-llm-leaderboard/details_Isotonic__Mixnueza-Chat-6x32M-MoE/blob/main/results_2024-04-07T02-09-55.470077.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25475467189344064,
"acc_stderr": 0.030639762090785793,
"acc_norm": 0.2552581810114775,
"acc_norm_stderr": 0.03144935013039305,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.015298077509485083,
"mc2": 0.4727026528122458,
"mc2_stderr": 0.015699277111857743
},
"harness|arc:challenge|25": {
"acc": 0.18088737201365188,
"acc_stderr": 0.011248574467407034,
"acc_norm": 0.20392491467576793,
"acc_norm_stderr": 0.011774262478702256
},
"harness|hellaswag|10": {
"acc": 0.2629954192391954,
"acc_stderr": 0.004393601887506585,
"acc_norm": 0.26528579964150567,
"acc_norm_stderr": 0.004405829993258718
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768081,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768081
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.03633384414073461,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.03633384414073461
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210325,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210325
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.2,
"acc_stderr": 0.040201512610368445,
"acc_norm": 0.2,
"acc_norm_stderr": 0.040201512610368445
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21509433962264152,
"acc_stderr": 0.025288394502891356,
"acc_norm": 0.21509433962264152,
"acc_norm_stderr": 0.025288394502891356
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847415,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847415
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198816,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198816
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237654,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237654
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.251063829787234,
"acc_stderr": 0.028346963777162452,
"acc_norm": 0.251063829787234,
"acc_norm_stderr": 0.028346963777162452
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.19298245614035087,
"acc_stderr": 0.037124548537213684,
"acc_norm": 0.19298245614035087,
"acc_norm_stderr": 0.037124548537213684
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.03192271569548299,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.03192271569548299
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27461139896373055,
"acc_stderr": 0.03221024508041154,
"acc_norm": 0.27461139896373055,
"acc_norm_stderr": 0.03221024508041154
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3564102564102564,
"acc_stderr": 0.024283140529467295,
"acc_norm": 0.3564102564102564,
"acc_norm_stderr": 0.024283140529467295
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145668,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145668
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3025210084033613,
"acc_stderr": 0.02983796238829193,
"acc_norm": 0.3025210084033613,
"acc_norm_stderr": 0.02983796238829193
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22752293577981653,
"acc_stderr": 0.017974463578776502,
"acc_norm": 0.22752293577981653,
"acc_norm_stderr": 0.017974463578776502
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.02910225438967409,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.02910225438967409
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.27802690582959644,
"acc_stderr": 0.030069584874494047,
"acc_norm": 0.27802690582959644,
"acc_norm_stderr": 0.030069584874494047
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3128834355828221,
"acc_stderr": 0.036429145782924055,
"acc_norm": 0.3128834355828221,
"acc_norm_stderr": 0.036429145782924055
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.19642857142857142,
"acc_stderr": 0.03770970049347018,
"acc_norm": 0.19642857142857142,
"acc_norm_stderr": 0.03770970049347018
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2264957264957265,
"acc_stderr": 0.027421007295392926,
"acc_norm": 0.2264957264957265,
"acc_norm_stderr": 0.027421007295392926
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2848020434227331,
"acc_stderr": 0.01613917409652258,
"acc_norm": 0.2848020434227331,
"acc_norm_stderr": 0.01613917409652258
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.22832369942196531,
"acc_stderr": 0.022598703804321635,
"acc_norm": 0.22832369942196531,
"acc_norm_stderr": 0.022598703804321635
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.28938906752411575,
"acc_stderr": 0.025755865922632924,
"acc_norm": 0.28938906752411575,
"acc_norm_stderr": 0.025755865922632924
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005723,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005723
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843007,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843007
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23728813559322035,
"acc_stderr": 0.010865436690780264,
"acc_norm": 0.23728813559322035,
"acc_norm_stderr": 0.010865436690780264
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.016684820929148587,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.016684820929148587
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.038950910157241364,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.038950910157241364
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.23265306122448978,
"acc_stderr": 0.02704925791589618,
"acc_norm": 0.23265306122448978,
"acc_norm_stderr": 0.02704925791589618
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.036643147772880864,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.036643147772880864
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.015298077509485083,
"mc2": 0.4727026528122458,
"mc2_stderr": 0.015699277111857743
},
"harness|winogrande|5": {
"acc": 0.505130228887135,
"acc_stderr": 0.01405174596179052
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
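For quick inspection, the per-task metrics above flatten naturally into a table; a minimal sketch, assuming the JSON block has been saved locally as `results.json` (a hypothetical path):
```python
import json

import pandas as pd

# "results.json" is a hypothetical local copy of the JSON block above.
with open("results.json") as f:
    results = json.load(f)

# Rows become task names ("harness|arc:challenge|25", ...), columns the metrics.
df = pd.DataFrame.from_dict(results, orient="index")
print(df[["acc", "acc_stderr"]].dropna().sort_values("acc", ascending=False).head())
```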
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
nthakur/miracl-raft-instruct | nthakur | "2024-04-11T21:32:11Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T02:24:14Z" | ---
dataset_info:
- config_name: ar
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 10915524
num_examples: 3128
download_size: 4623442
dataset_size: 10915524
- config_name: bn
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 9162406
num_examples: 1508
download_size: 3137944
dataset_size: 9162406
- config_name: en
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 6462721
num_examples: 2108
download_size: 3293882
dataset_size: 6462721
- config_name: es
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 7719932
num_examples: 1971
download_size: 4085416
dataset_size: 7719932
- config_name: fa
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 6837240
num_examples: 1907
download_size: 2794448
dataset_size: 6837240
- config_name: fi
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 4060508
num_examples: 1852
download_size: 1976822
dataset_size: 4060508
- config_name: fr
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 2840468
num_examples: 1057
download_size: 1413994
dataset_size: 2840468
- config_name: hi
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 5778413
num_examples: 1099
download_size: 2006964
dataset_size: 5778413
- config_name: id
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 11111390
num_examples: 3392
download_size: 5470039
dataset_size: 11111390
- config_name: ja
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 8098770
num_examples: 2988
download_size: 3921802
dataset_size: 8098770
- config_name: ko
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 1525298
num_examples: 587
download_size: 736949
dataset_size: 1525298
- config_name: ru
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 15838835
num_examples: 4085
download_size: 7121760
dataset_size: 15838835
- config_name: sw
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 1114154
num_examples: 616
download_size: 441880
dataset_size: 1114154
- config_name: te
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 4083245
num_examples: 1003
download_size: 1294119
dataset_size: 4083245
- config_name: th
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 11672646
num_examples: 2556
download_size: 4007556
dataset_size: 11672646
- config_name: zh
features:
- name: output
list:
- name: model
dtype: string
- name: output
dtype: string
- name: prompt
dtype: string
- name: query_id
dtype: string
- name: doc_ids
sequence: string
- name: positive_ids
sequence: string
- name: negative_ids
sequence: 'null'
splits:
- name: train
num_bytes: 2469288
num_examples: 1029
download_size: 1362216
dataset_size: 2469288
configs:
- config_name: ar
data_files:
- split: train
path: ar/train-*
- config_name: bn
data_files:
- split: train
path: bn/train-*
- config_name: en
data_files:
- split: train
path: en/train-*
- config_name: es
data_files:
- split: train
path: es/train-*
- config_name: fa
data_files:
- split: train
path: fa/train-*
- config_name: fi
data_files:
- split: train
path: fi/train-*
- config_name: fr
data_files:
- split: train
path: fr/train-*
- config_name: hi
data_files:
- split: train
path: hi/train-*
- config_name: id
data_files:
- split: train
path: id/train-*
- config_name: ja
data_files:
- split: train
path: ja/train-*
- config_name: ko
data_files:
- split: train
path: ko/train-*
- config_name: ru
data_files:
- split: train
path: ru/train-*
- config_name: sw
data_files:
- split: train
path: sw/train-*
- config_name: te
data_files:
- split: train
path: te/train-*
- config_name: th
data_files:
- split: train
path: th/train-*
- config_name: zh
data_files:
- split: train
path: zh/train-*
---
# Dataset Card for "miracl-raft-instruct"
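The YAML metadata above defines one config per MIRACL language (`ar` through `zh`), each with a single `train` split whose rows carry a `prompt`, a `query_id`, retrieved `doc_ids` split into `positive_ids` and `negative_ids`, and a list of per-model `output`s. A minimal loading sketch based on those declared features (the field access is an assumption read off the metadata, not documented usage):
```python
from datasets import load_dataset

# Config names ("ar", "bn", ..., "zh") come from the YAML metadata above.
ds = load_dataset("nthakur/miracl-raft-instruct", "en", split="train")

row = ds[0]
print(row["query_id"])       # query identifier
print(row["prompt"][:200])   # RAFT-style instruction prompt (truncated)
for gen in row["output"]:    # list of {"model": ..., "output": ...} generations
    print(gen["model"])
```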
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
SWHL/text_det_test_dataset | SWHL | "2024-04-09T02:11:10Z" | 0 | 1 | [
"language:zh",
"language:en",
"license:apache-2.0",
"size_categories:n<1K",
"modality:image",
"region:us"
] | null | "2024-04-07T03:00:41Z" | ---
license: apache-2.0
language:
- zh
- en
size_categories:
- n<1K
---
## Text Detection Test Set
### Dataset Overview
- This test set covers three major categories: cards/certificates, documents, and natural scenes, with 82 card/certificate images, 75 document images, and 55 natural-scene images.
- The dataset can be used together with the [text detection metric evaluation library TextDetMetric](https://github.com/SWHL/TextDetMetric) to quickly evaluate various text detection algorithms.
- **Contributions of more data to this dataset are very welcome! If you have any ideas, head over to the [issues](https://github.com/SWHL/TextDetMetric/issues) to discuss.**
### Supported Tasks
The dataset can be used for model validation and performance evaluation on custom datasets.
#### Loading the Dataset
```python
from datasets import load_dataset
dataset = load_dataset("SWHL/text_det_test_dataset")
test_data = dataset['test']
print(test_data)
```
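Since the annotations follow the labelme-style JSON layout described in the Annotations section below, a single annotation file can also be read directly with the standard library; a minimal sketch (the file name `19.json` is hypothetical):
```python
import json

# "19.json" is a hypothetical annotation file in the labelme-style format shown below.
with open("19.json", encoding="utf-8") as f:
    ann = json.load(f)

print(ann["file_name"], ann["imageWidth"], ann["imageHeight"])
for shape in ann["shapes"]:
    # Each text region is stored as a polygon of [x, y] points.
    print(shape["shape_type"], shape["points"])
```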
### Dataset Creation
#### Source Data
The data was collected from the web; it will be removed upon request in case of infringement.
#### Annotations
- The dataset was annotated with [**labelme**](https://github.com/wkentaro/labelme); the only difference from the standard JSON format is that the `imagePath` field is renamed to `file_name`. Everything else is identical.
- Annotations are stored in `json` format; an example is shown below:
```json
{
"version": "4.5.6",
"flags": {},
"shapes": [
{
"label": "",
"points": [
[
486.0,
751.0
],
[
588.0,
751.0
],
[
588.0,
779.0
],
[
486.0,
779.0
]
],
"group_id": null,
"shape_type": "polygon",
"flags": {}
},
{
"label": "",
"points": [
[
385.1382113821138,
686.6178861788618
],
[
592.4552845528456,
688.2439024390244
],
[
589.2032520325204,
739.4634146341464
],
[
386.7642276422764,
737.0243902439024
]
],
"group_id": null,
"shape_type": "polygon",
"flags": {}
}
],
"file_name": "19.png",
"imageData": "xxxxxx",
"imageHeight": 784,
"imageWidth": 613
}
``` |
viarias/remote_sensing_2018_weedmap | viarias | "2024-04-07T03:52:47Z" | 0 | 0 | [
"task_categories:image-segmentation",
"language:en",
"license:apache-2.0",
"size_categories:1K<n<10K",
"region:us"
] | [
"image-segmentation"
] | "2024-04-07T03:19:59Z" | ---
dataset_info:
- config_name: red_edge
features:
- name: B
dtype: image
- name: CIR
dtype: image
- name: G
dtype: image
- name: NDVI
dtype: image
- name: NIR
dtype: image
- name: R
dtype: image
- name: RE
dtype: image
- name: RGB
dtype: image
- name: annotation
dtype: image
splits:
- name: train
num_bytes: 1180504
num_examples: 766
- name: test
num_bytes: 314394
num_examples: 204
download_size: 637901163
dataset_size: 1494898
- config_name: sequoia
features:
- name: CIR
dtype: image
- name: G
dtype: image
- name: NDVI
dtype: image
- name: NIR
dtype: image
- name: R
dtype: image
- name: RE
dtype: image
- name: annotation
dtype: image
splits:
- name: train
num_bytes: 515690
num_examples: 428
- name: test
num_bytes: 327726
num_examples: 272
download_size: 444145925
dataset_size: 843416
license: apache-2.0
task_categories:
- image-segmentation
language:
- en
size_categories:
- 1K<n<10K
--- |
open-llm-leaderboard-old/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9 | open-llm-leaderboard-old | "2024-04-07T03:34:21Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T03:34:00Z" | ---
pretty_name: Evaluation run of jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T03:31:40.171262](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9/blob/main/results_2024-04-07T03-31-40.171262.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6559143487632242,\n\
\ \"acc_stderr\": 0.03200996549462357,\n \"acc_norm\": 0.6554287821986152,\n\
\ \"acc_norm_stderr\": 0.032675253303136656,\n \"mc1\": 0.5936352509179926,\n\
\ \"mc1_stderr\": 0.017193835812093886,\n \"mc2\": 0.7482652601699259,\n\
\ \"mc2_stderr\": 0.014273429873734122\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7081911262798635,\n \"acc_stderr\": 0.01328452529240351,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710695\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7230631348336984,\n\
\ \"acc_stderr\": 0.00446570481089354,\n \"acc_norm\": 0.887572196773551,\n\
\ \"acc_norm_stderr\": 0.003152464637757645\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4827586206896552,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.02450980392156862,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.02450980392156862\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4659217877094972,\n\
\ \"acc_stderr\": 0.016683615837486867,\n \"acc_norm\": 0.4659217877094972,\n\
\ \"acc_norm_stderr\": 0.016683615837486867\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.025839898334877983,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.025839898334877983\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5035460992907801,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.5035460992907801,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.01275616194252337,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.01275616194252337\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.01897542792050721,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.01897542792050721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699121,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699121\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5936352509179926,\n\
\ \"mc1_stderr\": 0.017193835812093886,\n \"mc2\": 0.7482652601699259,\n\
\ \"mc2_stderr\": 0.014273429873734122\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8358326756116812,\n \"acc_stderr\": 0.010410849775222782\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.012714401009923647\n }\n}\n```"
repo_url: https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|arc:challenge|25_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|gsm8k|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hellaswag|10_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T03-31-40.171262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T03-31-40.171262.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- '**/details_harness|winogrande|5_2024-04-07T03-31-40.171262.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T03-31-40.171262.parquet'
- config_name: results
data_files:
- split: 2024_04_07T03_31_40.171262
path:
- results_2024-04-07T03-31-40.171262.parquet
- split: latest
path:
- results_2024-04-07T03-31-40.171262.parquet
---
# Dataset Card for Evaluation run of jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9](https://huggingface.co/jsfs11/MixtureofMerges-MoE-2x7b-SLERPv0.9) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9",
"harness_winogrande_5",
split="train")
```
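For example, to pull the aggregated metrics or the per-sample details of a specific timestamped run, you can reuse the configuration and split names listed in this card (a minimal sketch):
```python
from datasets import load_dataset

# Aggregated metrics of the run: the "results" configuration, where the
# "latest" split always points to the most recent evaluation.
results = load_dataset(
    "open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9",
    "results",
    split="latest",
)

# Per-sample details for a single task, addressed by the timestamp of the run.
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9",
    "harness_winogrande_5",
    split="2024_04_07T03_31_40.171262",
)
```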
## Latest results
These are the [latest results from run 2024-04-07T03:31:40.171262](https://huggingface.co/datasets/open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9/blob/main/results_2024-04-07T03-31-40.171262.json) (note that there might be results for other tasks in this repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6559143487632242,
"acc_stderr": 0.03200996549462357,
"acc_norm": 0.6554287821986152,
"acc_norm_stderr": 0.032675253303136656,
"mc1": 0.5936352509179926,
"mc1_stderr": 0.017193835812093886,
"mc2": 0.7482652601699259,
"mc2_stderr": 0.014273429873734122
},
"harness|arc:challenge|25": {
"acc": 0.7081911262798635,
"acc_stderr": 0.01328452529240351,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710695
},
"harness|hellaswag|10": {
"acc": 0.7230631348336984,
"acc_stderr": 0.00446570481089354,
"acc_norm": 0.887572196773551,
"acc_norm_stderr": 0.003152464637757645
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.02450980392156862,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.02450980392156862
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4659217877094972,
"acc_stderr": 0.016683615837486867,
"acc_norm": 0.4659217877094972,
"acc_norm_stderr": 0.016683615837486867
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.025839898334877983,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.025839898334877983
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5035460992907801,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.5035460992907801,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.01275616194252337,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.01275616194252337
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.01897542792050721,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.01897542792050721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699121,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699121
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5936352509179926,
"mc1_stderr": 0.017193835812093886,
"mc2": 0.7482652601699259,
"mc2_stderr": 0.014273429873734122
},
"harness|winogrande|5": {
"acc": 0.8358326756116812,
"acc_stderr": 0.010410849775222782
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923647
}
}
```
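If you only need the aggregate numbers shown above, one option is to download the linked results file directly; the sketch below assumes the metrics sit under a top-level "results" key of that JSON file:
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced above from the dataset repository.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_jsfs11__MixtureofMerges-MoE-2x7b-SLERPv0.9",
    filename="results_2024-04-07T03-31-40.171262.json",
    repo_type="dataset",
)

with open(path) as f:
    data = json.load(f)

# Assumed layout: the snippet above corresponds to data["results"];
# fall back to the whole document if that key is absent.
print(data.get("results", data)["all"]["acc_norm"])
```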
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
alkav/guanaco-llama2-1k | alkav | "2024-04-07T04:34:02Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T04:34:00Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_TeeZee__NEBULA-XB-v1.0_SFT_2_epoch | open-llm-leaderboard-old | "2024-04-07T04:57:37Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T04:57:16Z" | ---
pretty_name: Evaluation run of TeeZee/NEBULA-XB-v1.0_SFT_2_epoch
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/NEBULA-XB-v1.0_SFT_2_epoch](https://huggingface.co/TeeZee/NEBULA-XB-v1.0_SFT_2_epoch)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0_SFT_2_epoch\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T04:54:59.636887](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0_SFT_2_epoch/blob/main/results_2024-04-07T04-54-59.636887.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6452862614843039,\n\
\ \"acc_stderr\": 0.03162939989987327,\n \"acc_norm\": 0.6570693455480727,\n\
\ \"acc_norm_stderr\": 0.03246967905122735,\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.01680186046667715,\n \"mc2\": 0.5205616644281855,\n\
\ \"mc2_stderr\": 0.014938012326386271\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097862,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.01410457836649189\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6573391754630552,\n\
\ \"acc_stderr\": 0.004736292355716397,\n \"acc_norm\": 0.8507269468233419,\n\
\ \"acc_norm_stderr\": 0.003556291232050353\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6763005780346821,\n\
\ \"acc_stderr\": 0.03567603799639172,\n \"acc_norm\": 0.6763005780346821,\n\
\ \"acc_norm_stderr\": 0.03567603799639172\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5872340425531914,\n \"acc_stderr\": 0.03218471141400351,\n\
\ \"acc_norm\": 0.5872340425531914,\n \"acc_norm_stderr\": 0.03218471141400351\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4576719576719577,\n \"acc_stderr\": 0.02565886886205833,\n \"\
acc_norm\": 0.4576719576719577,\n \"acc_norm_stderr\": 0.02565886886205833\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7741935483870968,\n\
\ \"acc_stderr\": 0.023785577884181012,\n \"acc_norm\": 0.7741935483870968,\n\
\ \"acc_norm_stderr\": 0.023785577884181012\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8121212121212121,\n \"acc_stderr\": 0.03050193405942914,\n\
\ \"acc_norm\": 0.8121212121212121,\n \"acc_norm_stderr\": 0.03050193405942914\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8383838383838383,\n \"acc_stderr\": 0.026225919863629283,\n \"\
acc_norm\": 0.8383838383838383,\n \"acc_norm_stderr\": 0.026225919863629283\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289708,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289708\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37407407407407406,\n \"acc_stderr\": 0.029502861128955286,\n \
\ \"acc_norm\": 0.37407407407407406,\n \"acc_norm_stderr\": 0.029502861128955286\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7226890756302521,\n \"acc_stderr\": 0.029079374539480007,\n\
\ \"acc_norm\": 0.7226890756302521,\n \"acc_norm_stderr\": 0.029079374539480007\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.033851779760448106,\n \"\
acc_norm\": 0.5601851851851852,\n \"acc_norm_stderr\": 0.033851779760448106\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.02704462171947408,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.02704462171947408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944853,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944853\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7085201793721974,\n\
\ \"acc_stderr\": 0.030500283176545847,\n \"acc_norm\": 0.7085201793721974,\n\
\ \"acc_norm_stderr\": 0.030500283176545847\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709696,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709696\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092365,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8160919540229885,\n\
\ \"acc_stderr\": 0.01385372417092253,\n \"acc_norm\": 0.8160919540229885,\n\
\ \"acc_norm_stderr\": 0.01385372417092253\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7514450867052023,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.7514450867052023,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31843575418994413,\n\
\ \"acc_stderr\": 0.015581008080360278,\n \"acc_norm\": 0.31843575418994413,\n\
\ \"acc_norm_stderr\": 0.015581008080360278\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695815,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695815\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.025583062489984806,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.025583062489984806\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7685185185185185,\n \"acc_stderr\": 0.023468429832451135,\n\
\ \"acc_norm\": 0.7685185185185185,\n \"acc_norm_stderr\": 0.023468429832451135\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47522816166883963,\n\
\ \"acc_stderr\": 0.012754553719781752,\n \"acc_norm\": 0.47522816166883963,\n\
\ \"acc_norm_stderr\": 0.012754553719781752\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.027678468642144714,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.027678468642144714\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162662,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162662\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7551020408163265,\n \"acc_stderr\": 0.027529637440174923,\n\
\ \"acc_norm\": 0.7551020408163265,\n \"acc_norm_stderr\": 0.027529637440174923\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3598531211750306,\n\
\ \"mc1_stderr\": 0.01680186046667715,\n \"mc2\": 0.5205616644281855,\n\
\ \"mc2_stderr\": 0.014938012326386271\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8224151539068666,\n \"acc_stderr\": 0.010740676861359238\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245431\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/NEBULA-XB-v1.0_SFT_2_epoch
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|arc:challenge|25_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|gsm8k|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hellaswag|10_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T04-54-59.636887.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T04-54-59.636887.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- '**/details_harness|winogrande|5_2024-04-07T04-54-59.636887.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T04-54-59.636887.parquet'
- config_name: results
data_files:
- split: 2024_04_07T04_54_59.636887
path:
- results_2024-04-07T04-54-59.636887.parquet
- split: latest
path:
- results_2024-04-07T04-54-59.636887.parquet
---
# Dataset Card for Evaluation run of TeeZee/NEBULA-XB-v1.0_SFT_2_epoch
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/NEBULA-XB-v1.0_SFT_2_epoch](https://huggingface.co/TeeZee/NEBULA-XB-v1.0_SFT_2_epoch) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0_SFT_2_epoch",
"harness_winogrande_5",
split="train")
```
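You can also pin a specific run by its timestamped split name rather than relying on "latest". A minimal sketch, assuming the split names listed in the configs above:
```python
from datasets import load_dataset

# Each run is stored as a split named after its timestamp; "latest" is an
# alias for the most recent run (here there is a single 2024-04-07 run).
data = load_dataset("open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0_SFT_2_epoch",
	"harness_winogrande_5",
	split="2024_04_07T04_54_59.636887")
print(data)
```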
## Latest results
These are the [latest results from run 2024-04-07T04:54:59.636887](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0_SFT_2_epoch/blob/main/results_2024-04-07T04-54-59.636887.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6452862614843039,
"acc_stderr": 0.03162939989987327,
"acc_norm": 0.6570693455480727,
"acc_norm_stderr": 0.03246967905122735,
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667715,
"mc2": 0.5205616644281855,
"mc2_stderr": 0.014938012326386271
},
"harness|arc:challenge|25": {
"acc": 0.5938566552901023,
"acc_stderr": 0.014351656690097862,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.01410457836649189
},
"harness|hellaswag|10": {
"acc": 0.6573391754630552,
"acc_stderr": 0.004736292355716397,
"acc_norm": 0.8507269468233419,
"acc_norm_stderr": 0.003556291232050353
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.03567603799639172,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.03567603799639172
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5872340425531914,
"acc_stderr": 0.03218471141400351,
"acc_norm": 0.5872340425531914,
"acc_norm_stderr": 0.03218471141400351
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4576719576719577,
"acc_stderr": 0.02565886886205833,
"acc_norm": 0.4576719576719577,
"acc_norm_stderr": 0.02565886886205833
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7741935483870968,
"acc_stderr": 0.023785577884181012,
"acc_norm": 0.7741935483870968,
"acc_norm_stderr": 0.023785577884181012
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8121212121212121,
"acc_stderr": 0.03050193405942914,
"acc_norm": 0.8121212121212121,
"acc_norm_stderr": 0.03050193405942914
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8383838383838383,
"acc_stderr": 0.026225919863629283,
"acc_norm": 0.8383838383838383,
"acc_norm_stderr": 0.026225919863629283
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289708,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289708
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37407407407407406,
"acc_stderr": 0.029502861128955286,
"acc_norm": 0.37407407407407406,
"acc_norm_stderr": 0.029502861128955286
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7226890756302521,
"acc_stderr": 0.029079374539480007,
"acc_norm": 0.7226890756302521,
"acc_norm_stderr": 0.029079374539480007
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.033851779760448106,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.033851779760448106
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.02704462171947408,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.02704462171947408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944853,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944853
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7085201793721974,
"acc_stderr": 0.030500283176545847,
"acc_norm": 0.7085201793721974,
"acc_norm_stderr": 0.030500283176545847
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709696,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709696
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092365,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8160919540229885,
"acc_stderr": 0.01385372417092253,
"acc_norm": 0.8160919540229885,
"acc_norm_stderr": 0.01385372417092253
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7514450867052023,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.7514450867052023,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31843575418994413,
"acc_stderr": 0.015581008080360278,
"acc_norm": 0.31843575418994413,
"acc_norm_stderr": 0.015581008080360278
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695815,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695815
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.025583062489984806,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.025583062489984806
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.023468429832451135,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.023468429832451135
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47522816166883963,
"acc_stderr": 0.012754553719781752,
"acc_norm": 0.47522816166883963,
"acc_norm_stderr": 0.012754553719781752
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.027678468642144714,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.027678468642144714
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162662,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162662
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7551020408163265,
"acc_stderr": 0.027529637440174923,
"acc_norm": 0.7551020408163265,
"acc_norm_stderr": 0.027529637440174923
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3598531211750306,
"mc1_stderr": 0.01680186046667715,
"mc2": 0.5205616644281855,
"mc2_stderr": 0.014938012326386271
},
"harness|winogrande|5": {
"acc": 0.8224151539068666,
"acc_stderr": 0.010740676861359238
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245431
}
}
```
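To work with these aggregated numbers programmatically rather than reading the JSON blob above, you can load the "results" configuration. A minimal sketch, assuming the stored record mirrors the JSON structure shown here:
```python
from datasets import load_dataset

# The "results" config stores the aggregated metrics; "latest" points to the
# most recent run. The loaded record should mirror the JSON blob above.
results = load_dataset("open-llm-leaderboard/details_TeeZee__NEBULA-XB-v1.0_SFT_2_epoch",
	"results",
	split="latest")
print(results[0])
```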
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0 | open-llm-leaderboard-old | "2024-04-07T05:03:43Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T05:03:24Z" | ---
pretty_name: Evaluation run of DUAL-GPO/zephyr-7b-ipo-qlora-v0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DUAL-GPO/zephyr-7b-ipo-qlora-v0](https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-qlora-v0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"latest\" split always points to the most recent results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T05:01:01.232002](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0/blob/main/results_2024-04-07T05-01-01.232002.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6336157909151666,\n\
\ \"acc_stderr\": 0.03259268627428965,\n \"acc_norm\": 0.6388963560492149,\n\
\ \"acc_norm_stderr\": 0.03325258805528349,\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4535332788498766,\n\
\ \"mc2_stderr\": 0.014550976512746065\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.0143610972884497,\n\
\ \"acc_norm\": 0.6313993174061433,\n \"acc_norm_stderr\": 0.0140978106780422\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6427006572395937,\n\
\ \"acc_stderr\": 0.004782246931195,\n \"acc_norm\": 0.8436566421031667,\n\
\ \"acc_norm_stderr\": 0.0036243831208234508\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.042446332383532265,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.042446332383532265\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.027943219989337142,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.027943219989337142\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7516129032258064,\n\
\ \"acc_stderr\": 0.024580028921481003,\n \"acc_norm\": 0.7516129032258064,\n\
\ \"acc_norm_stderr\": 0.024580028921481003\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6461538461538462,\n \"acc_stderr\": 0.024243783994062157,\n\
\ \"acc_norm\": 0.6461538461538462,\n \"acc_norm_stderr\": 0.024243783994062157\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028597,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028597\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135363,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135363\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.01633288239343139,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.01633288239343139\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.038968789850704164,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.038968789850704164\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.03157065078911901,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.03157065078911901\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.80970625798212,\n\
\ \"acc_stderr\": 0.014036945850381396,\n \"acc_norm\": 0.80970625798212,\n\
\ \"acc_norm_stderr\": 0.014036945850381396\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323378,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323378\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.40782122905027934,\n\
\ \"acc_stderr\": 0.016435865260914746,\n \"acc_norm\": 0.40782122905027934,\n\
\ \"acc_norm_stderr\": 0.016435865260914746\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7483660130718954,\n \"acc_stderr\": 0.024848018263875192,\n\
\ \"acc_norm\": 0.7483660130718954,\n \"acc_norm_stderr\": 0.024848018263875192\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7266881028938906,\n\
\ \"acc_stderr\": 0.025311765975426122,\n \"acc_norm\": 0.7266881028938906,\n\
\ \"acc_norm_stderr\": 0.025311765975426122\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.439374185136897,\n\
\ \"acc_stderr\": 0.012676014778580215,\n \"acc_norm\": 0.439374185136897,\n\
\ \"acc_norm_stderr\": 0.012676014778580215\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7022058823529411,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.7022058823529411,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128455,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8159203980099502,\n\
\ \"acc_stderr\": 0.02740385941078685,\n \"acc_norm\": 0.8159203980099502,\n\
\ \"acc_norm_stderr\": 0.02740385941078685\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.29865361077111385,\n\
\ \"mc1_stderr\": 0.016021570613768542,\n \"mc2\": 0.4535332788498766,\n\
\ \"mc2_stderr\": 0.014550976512746065\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7955801104972375,\n \"acc_stderr\": 0.011334090612597221\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.400303260045489,\n \
\ \"acc_stderr\": 0.013495926436566438\n }\n}\n```"
repo_url: https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-qlora-v0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|arc:challenge|25_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|gsm8k|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hellaswag|10_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-01-01.232002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T05-01-01.232002.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- '**/details_harness|winogrande|5_2024-04-07T05-01-01.232002.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T05-01-01.232002.parquet'
- config_name: results
data_files:
- split: 2024_04_07T05_01_01.232002
path:
- results_2024-04-07T05-01-01.232002.parquet
- split: latest
path:
- results_2024-04-07T05-01-01.232002.parquet
---
# Dataset Card for Evaluation run of DUAL-GPO/zephyr-7b-ipo-qlora-v0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DUAL-GPO/zephyr-7b-ipo-qlora-v0](https://huggingface.co/DUAL-GPO/zephyr-7b-ipo-qlora-v0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0",
"harness_winogrande_5",
split="train")
```
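If you want the most recent run rather than a specific timestamped split, a minimal variation is shown below. It assumes the `latest` split and the aggregated `results` configuration listed in this repository's config section:
```python
from datasets import load_dataset

# Per-task details pinned to the most recent evaluation run
winogrande_latest = load_dataset(
    "open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0",
    "harness_winogrande_5",
    split="latest",
)

# Aggregated metrics for the whole run
results_latest = load_dataset(
    "open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0",
    "results",
    split="latest",
)
```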
## Latest results
These are the [latest results from run 2024-04-07T05:01:01.232002](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-ipo-qlora-v0/blob/main/results_2024-04-07T05-01-01.232002.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6336157909151666,
"acc_stderr": 0.03259268627428965,
"acc_norm": 0.6388963560492149,
"acc_norm_stderr": 0.03325258805528349,
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4535332788498766,
"mc2_stderr": 0.014550976512746065
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.0143610972884497,
"acc_norm": 0.6313993174061433,
"acc_norm_stderr": 0.0140978106780422
},
"harness|hellaswag|10": {
"acc": 0.6427006572395937,
"acc_stderr": 0.004782246931195,
"acc_norm": 0.8436566421031667,
"acc_norm_stderr": 0.0036243831208234508
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.042446332383532265,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.042446332383532265
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.027943219989337142,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.027943219989337142
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7516129032258064,
"acc_stderr": 0.024580028921481003,
"acc_norm": 0.7516129032258064,
"acc_norm_stderr": 0.024580028921481003
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6461538461538462,
"acc_stderr": 0.024243783994062157,
"acc_norm": 0.6461538461538462,
"acc_norm_stderr": 0.024243783994062157
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028597,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028597
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135363,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135363
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.01633288239343139,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.01633288239343139
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.038968789850704164,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.038968789850704164
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.03157065078911901,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.03157065078911901
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.80970625798212,
"acc_stderr": 0.014036945850381396,
"acc_norm": 0.80970625798212,
"acc_norm_stderr": 0.014036945850381396
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323378,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323378
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.40782122905027934,
"acc_stderr": 0.016435865260914746,
"acc_norm": 0.40782122905027934,
"acc_norm_stderr": 0.016435865260914746
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7483660130718954,
"acc_stderr": 0.024848018263875192,
"acc_norm": 0.7483660130718954,
"acc_norm_stderr": 0.024848018263875192
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7266881028938906,
"acc_stderr": 0.025311765975426122,
"acc_norm": 0.7266881028938906,
"acc_norm_stderr": 0.025311765975426122
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.439374185136897,
"acc_stderr": 0.012676014778580215,
"acc_norm": 0.439374185136897,
"acc_norm_stderr": 0.012676014778580215
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7022058823529411,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.7022058823529411,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128455,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8159203980099502,
"acc_stderr": 0.02740385941078685,
"acc_norm": 0.8159203980099502,
"acc_norm_stderr": 0.02740385941078685
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.29865361077111385,
"mc1_stderr": 0.016021570613768542,
"mc2": 0.4535332788498766,
"mc2_stderr": 0.014550976512746065
},
"harness|winogrande|5": {
"acc": 0.7955801104972375,
"acc_stderr": 0.011334090612597221
},
"harness|gsm8k|5": {
"acc": 0.400303260045489,
"acc_stderr": 0.013495926436566438
}
}
```
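Since the block above is plain JSON, the per-task scores can be compared locally with a few lines of Python. The sketch below hard-codes three of the `acc_norm` values printed above purely for illustration; in practice you would use the full dictionary from the results file.
```python
# Illustrative subset of the per-task acc_norm values shown above (not the full set)
results = {
    "harness|hendrycksTest-marketing|5": {"acc_norm": 0.8589743589743589},
    "harness|hendrycksTest-us_foreign_policy|5": {"acc_norm": 0.83},
    "harness|hendrycksTest-moral_scenarios|5": {"acc_norm": 0.40782122905027934},
}

# Rank tasks from strongest to weakest by normalized accuracy
ranked = sorted(results.items(), key=lambda kv: kv[1]["acc_norm"], reverse=True)
for task, scores in ranked:
    print(f"{task}: {scores['acc_norm']:.3f}")
```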
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mpasila/DarkViperAU-Essays | mpasila | "2024-04-11T09:07:51Z" | 0 | 0 | [
"language:en",
"size_categories:1K<n<10K",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-07T05:13:59Z" | ---
language:
- en
---
This is just a raw text extract of his essays; it will be cleaned up later. |
open-llm-leaderboard-old/details_DUAL-GPO__zephyr-7b-gpo-update3-i0 | open-llm-leaderboard-old | "2024-04-07T05:17:16Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T05:16:55Z" | ---
pretty_name: Evaluation run of DUAL-GPO/zephyr-7b-gpo-update3-i0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [DUAL-GPO/zephyr-7b-gpo-update3-i0](https://huggingface.co/DUAL-GPO/zephyr-7b-gpo-update3-i0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-gpo-update3-i0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T05:14:35.819805](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-gpo-update3-i0/blob/main/results_2024-04-07T05-14-35.819805.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.623681988153887,\n\
\ \"acc_stderr\": 0.03274796696625993,\n \"acc_norm\": 0.6292879495251853,\n\
\ \"acc_norm_stderr\": 0.03341845200058962,\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347032,\n \"mc2\": 0.5185161412200289,\n\
\ \"mc2_stderr\": 0.014965955021645876\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6109215017064846,\n \"acc_stderr\": 0.014247309976045607,\n\
\ \"acc_norm\": 0.6518771331058021,\n \"acc_norm_stderr\": 0.013921008595179344\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.661521609241187,\n\
\ \"acc_stderr\": 0.004722250355106684,\n \"acc_norm\": 0.8537143995220076,\n\
\ \"acc_norm_stderr\": 0.0035267007418794435\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.03894734487013317,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.03894734487013317\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798335,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798335\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6358381502890174,\n\
\ \"acc_stderr\": 0.03669072477416907,\n \"acc_norm\": 0.6358381502890174,\n\
\ \"acc_norm_stderr\": 0.03669072477416907\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082635,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082635\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932261,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932261\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.032650194750335815,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.032650194750335815\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6068965517241379,\n \"acc_stderr\": 0.0407032901370707,\n\
\ \"acc_norm\": 0.6068965517241379,\n \"acc_norm_stderr\": 0.0407032901370707\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778415,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778415\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7322580645161291,\n\
\ \"acc_stderr\": 0.025189006660212385,\n \"acc_norm\": 0.7322580645161291,\n\
\ \"acc_norm_stderr\": 0.025189006660212385\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4433497536945813,\n \"acc_stderr\": 0.03495334582162934,\n\
\ \"acc_norm\": 0.4433497536945813,\n \"acc_norm_stderr\": 0.03495334582162934\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479049,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479049\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.02541634309630643,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.02541634309630643\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478923,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478923\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8055045871559633,\n \"acc_stderr\": 0.016970289090458033,\n \"\
acc_norm\": 0.8055045871559633,\n \"acc_norm_stderr\": 0.016970289090458033\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.041331194402438376,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.041331194402438376\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.0239023255495604,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.0239023255495604\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7052023121387283,\n \"acc_stderr\": 0.024547617794803828,\n\
\ \"acc_norm\": 0.7052023121387283,\n \"acc_norm_stderr\": 0.024547617794803828\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.01649540063582008,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.01649540063582008\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.0256468630971379,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.0256468630971379\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.01268781841959992,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.01268781841959992\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.02888819310398863,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.02888819310398863\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.028996909693328923,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.028996909693328923\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774708,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774708\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3427172582619339,\n\
\ \"mc1_stderr\": 0.016614949385347032,\n \"mc2\": 0.5185161412200289,\n\
\ \"mc2_stderr\": 0.014965955021645876\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7987371744277821,\n \"acc_stderr\": 0.01126851997157768\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3510235026535254,\n \
\ \"acc_stderr\": 0.013146945941397217\n }\n}\n```"
repo_url: https://huggingface.co/DUAL-GPO/zephyr-7b-gpo-update3-i0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|arc:challenge|25_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|gsm8k|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hellaswag|10_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-14-35.819805.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T05-14-35.819805.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- '**/details_harness|winogrande|5_2024-04-07T05-14-35.819805.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T05-14-35.819805.parquet'
- config_name: results
data_files:
- split: 2024_04_07T05_14_35.819805
path:
- results_2024-04-07T05-14-35.819805.parquet
- split: latest
path:
- results_2024-04-07T05-14-35.819805.parquet
---
# Dataset Card for Evaluation run of DUAL-GPO/zephyr-7b-gpo-update3-i0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [DUAL-GPO/zephyr-7b-gpo-update3-i0](https://huggingface.co/DUAL-GPO/zephyr-7b-gpo-update3-i0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-gpo-update3-i0",
"harness_winogrande_5",
split="train")
```
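Each per-task config follows the same pattern, and the aggregated metrics live in the "results" config. As a minimal sketch (config and split names are taken from the listing above; the "latest" split always resolves to the most recent run):
```python
from datasets import load_dataset

# Load the aggregated metrics for the most recent evaluation run;
# "results" is the config name and "latest" the alias split listed above.
results = load_dataset("open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-gpo-update3-i0",
	"results",
	split="latest")
```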
## Latest results
These are the [latest results from run 2024-04-07T05:14:35.819805](https://huggingface.co/datasets/open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-gpo-update3-i0/blob/main/results_2024-04-07T05-14-35.819805.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.623681988153887,
"acc_stderr": 0.03274796696625993,
"acc_norm": 0.6292879495251853,
"acc_norm_stderr": 0.03341845200058962,
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347032,
"mc2": 0.5185161412200289,
"mc2_stderr": 0.014965955021645876
},
"harness|arc:challenge|25": {
"acc": 0.6109215017064846,
"acc_stderr": 0.014247309976045607,
"acc_norm": 0.6518771331058021,
"acc_norm_stderr": 0.013921008595179344
},
"harness|hellaswag|10": {
"acc": 0.661521609241187,
"acc_stderr": 0.004722250355106684,
"acc_norm": 0.8537143995220076,
"acc_norm_stderr": 0.0035267007418794435
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.03894734487013317,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.03894734487013317
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798335,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798335
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.03669072477416907,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.03669072477416907
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082635,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082635
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932261,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932261
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.032650194750335815,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.032650194750335815
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6068965517241379,
"acc_stderr": 0.0407032901370707,
"acc_norm": 0.6068965517241379,
"acc_norm_stderr": 0.0407032901370707
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778415,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778415
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7322580645161291,
"acc_stderr": 0.025189006660212385,
"acc_norm": 0.7322580645161291,
"acc_norm_stderr": 0.025189006660212385
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4433497536945813,
"acc_stderr": 0.03495334582162934,
"acc_norm": 0.4433497536945813,
"acc_norm_stderr": 0.03495334582162934
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479049,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479049
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.02541634309630643,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.02541634309630643
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478923,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478923
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8055045871559633,
"acc_stderr": 0.016970289090458033,
"acc_norm": 0.8055045871559633,
"acc_norm_stderr": 0.016970289090458033
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.041331194402438376,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.041331194402438376
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.0239023255495604,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.0239023255495604
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.024547617794803828,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.024547617794803828
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.01649540063582008,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.01649540063582008
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.0256468630971379,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.0256468630971379
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.01268781841959992,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.01268781841959992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.02888819310398863,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.02888819310398863
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.028996909693328923,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.028996909693328923
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774708,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774708
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3427172582619339,
"mc1_stderr": 0.016614949385347032,
"mc2": 0.5185161412200289,
"mc2_stderr": 0.014965955021645876
},
"harness|winogrande|5": {
"acc": 0.7987371744277821,
"acc_stderr": 0.01126851997157768
},
"harness|gsm8k|5": {
"acc": 0.3510235026535254,
"acc_stderr": 0.013146945941397217
}
}
```
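If you prefer to work with the raw results file directly, you can download the JSON linked above. A minimal sketch, assuming the file mirrors the snippet shown here (an "all" key holding the aggregate metrics):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results JSON from the dataset repo and read the aggregates.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_DUAL-GPO__zephyr-7b-gpo-update3-i0",
    filename="results_2024-04-07T05-14-35.819805.json",
    repo_type="dataset",
)
with open(path) as f:
    results = json.load(f)

print(results["all"]["acc"], results["all"]["mc2"])
```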
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
existence-master/bloomify-classification-0.6k-simple | existence-master | "2024-04-10T08:56:56Z" | 0 | 0 | [
"license:gpl-3.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T05:17:52Z" | ---
license: gpl-3.0
---
|
Jay-Rajput/DIS_IPL_Outcomes | Jay-Rajput | "2024-05-30T06:51:40Z" | 0 | 0 | [
"size_categories:n<1K",
"modality:text",
"region:us"
] | null | "2024-04-07T05:17:58Z" | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: outcomes
data_files: outcomes/*.json
dataset_info:
features:
- name: match_id
dtype: string
- name: man_of_the_match
dtype: string
- name: winning_team
dtype: string
splits:
- name: train
num_bytes: 2049
num_examples: 53
download_size: 2929
dataset_size: 2049
license: apache-2.0
---
|
AnushaKulkarni/preferred_dataset2 | AnushaKulkarni | "2024-04-07T21:15:44Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T06:16:24Z" | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
dtype: string
- name: rejected
dtype: string
splits:
- name: train
num_bytes: 34117
num_examples: 50
download_size: 27043
dataset_size: 34117
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
wookyungseo/customllamacode-kor | wookyungseo | "2024-04-09T00:47:32Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T06:20:08Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 7786
num_examples: 32
download_size: 4172
dataset_size: 7786
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sdansdk/processed_meta_review | sdansdk | "2024-04-07T17:11:04Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T06:34:48Z" | ---
dataset_info:
features:
- name: Input
dtype: string
- name: Output
dtype: string
splits:
- name: train
num_bytes: 81617052
num_examples: 7680
- name: validation
num_bytes: 17524553
num_examples: 1645
- name: test
num_bytes: 17471237
num_examples: 1645
download_size: 58593680
dataset_size: 116612842
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
marsggbo/sst2_10000_mrpc_2000_MixtralMoE_patterns | marsggbo | "2024-04-07T06:41:12Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T06:40:37Z" | ---
dataset_info:
features:
- name: source
dtype: string
- name: prompt_len
dtype: int64
- name: token_idx
sequence: int64
- name: token_expert_patterns
sequence:
sequence:
sequence: int64
- name: sentence_expert_pattern
sequence:
sequence: int64
splits:
- name: train
num_bytes: 2758651736
num_examples: 12000
download_size: 35121925
dataset_size: 2758651736
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sdansdk/tokenized_meta_review | sdansdk | "2024-04-07T17:15:19Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T06:47:52Z" | ---
dataset_info:
features:
- name: Input
dtype: string
- name: Output
dtype: string
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 248586611
num_examples: 7680
- name: validation
num_bytes: 53401140
num_examples: 1645
- name: test
num_bytes: 53175067
num_examples: 1645
download_size: 127775545
dataset_size: 355162818
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Maximofn/opus100 | Maximofn | "2024-06-09T04:52:07Z" | 0 | 0 | [
"task_categories:translation",
"language:en",
"language:es",
"license:apache-2.0",
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"translation"
] | "2024-04-07T06:51:37Z" | ---
license: apache-2.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
- split: train
path: data/train-*
- split: validation
path: data/validation-*
dataset_info:
features:
- name: translation
dtype:
translation:
languages:
- en
- es
splits:
- name: test
num_bytes: 326262
num_examples: 2000
- name: train
num_bytes: 136643104
num_examples: 1000000
- name: validation
num_bytes: 326727
num_examples: 2000
download_size: 100103904
dataset_size: 137296093
task_categories:
- translation
language:
- en
- es
pretty_name: Opus100 EN-ES
size_categories:
- 100M<n<1B
---
# Dataset Card for "Opus100 EN-ES"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
mpasila/DarkViperAU-QA-ChatML | mpasila | "2024-04-11T09:10:09Z" | 0 | 0 | [
"language:en",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T08:01:22Z" | ---
language:
- en
---
This is unfinished and only meant for testing. |
open-llm-leaderboard-old/details_Locutusque__OpenCerebrum-1.5-Mistral-7B-v0.2-beta | open-llm-leaderboard-old | "2024-04-07T17:53:26Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T08:25:28Z" | ---
pretty_name: Evaluation run of Locutusque/OpenCerebrum-1.5-Mistral-7B-v0.2-beta
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/OpenCerebrum-1.5-Mistral-7B-v0.2-beta](https://huggingface.co/Locutusque/OpenCerebrum-1.5-Mistral-7B-v0.2-beta)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.5-Mistral-7B-v0.2-beta\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T17:50:41.215761](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.5-Mistral-7B-v0.2-beta/blob/main/results_2024-04-07T17-50-41.215761.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the \"results\" and \"latest\" splits for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6314331340791337,\n\
\ \"acc_stderr\": 0.0325188255122189,\n \"acc_norm\": 0.6366516060652345,\n\
\ \"acc_norm_stderr\": 0.033174336324203274,\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.4543367097746932,\n\
\ \"mc2_stderr\": 0.014562184308244347\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.01447800569418253,\n\
\ \"acc_norm\": 0.5998293515358362,\n \"acc_norm_stderr\": 0.014317197787809174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6331408086038638,\n\
\ \"acc_stderr\": 0.004809626723626825,\n \"acc_norm\": 0.8350926110336586,\n\
\ \"acc_norm_stderr\": 0.0037033852685121743\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5531914893617021,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.02525303255499769,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.02525303255499769\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139403,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139403\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.02381447708659355,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.02381447708659355\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.617948717948718,\n \"acc_stderr\": 0.024635549163908234,\n \
\ \"acc_norm\": 0.617948717948718,\n \"acc_norm_stderr\": 0.024635549163908234\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6302521008403361,\n \"acc_stderr\": 0.03135709599613591,\n \
\ \"acc_norm\": 0.6302521008403361,\n \"acc_norm_stderr\": 0.03135709599613591\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266868,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266868\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7990196078431373,\n \"acc_stderr\": 0.02812597226565438,\n \"\
acc_norm\": 0.7990196078431373,\n \"acc_norm_stderr\": 0.02812597226565438\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229962,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229962\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.672645739910314,\n\
\ \"acc_stderr\": 0.03149384670994131,\n \"acc_norm\": 0.672645739910314,\n\
\ \"acc_norm_stderr\": 0.03149384670994131\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5625,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.5625,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n\
\ \"acc_stderr\": 0.014351702181636863,\n \"acc_norm\": 0.7982120051085568,\n\
\ \"acc_norm_stderr\": 0.014351702181636863\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37318435754189944,\n\
\ \"acc_stderr\": 0.01617569201338196,\n \"acc_norm\": 0.37318435754189944,\n\
\ \"acc_norm_stderr\": 0.01617569201338196\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667874,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667874\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900926,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900926\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4511082138200782,\n\
\ \"acc_stderr\": 0.012709037347346233,\n \"acc_norm\": 0.4511082138200782,\n\
\ \"acc_norm_stderr\": 0.012709037347346233\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462927,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462927\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7224489795918367,\n \"acc_stderr\": 0.02866685779027465,\n\
\ \"acc_norm\": 0.7224489795918367,\n \"acc_norm_stderr\": 0.02866685779027465\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827075,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827075\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835816,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835816\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.30599755201958384,\n\
\ \"mc1_stderr\": 0.016132229728155045,\n \"mc2\": 0.4543367097746932,\n\
\ \"mc2_stderr\": 0.014562184308244347\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7837411207576953,\n \"acc_stderr\": 0.011570614861409347\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40561031084154664,\n \
\ \"acc_stderr\": 0.01352484889446211\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/OpenCerebrum-1.5-Mistral-7B-v0.2-beta
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|arc:challenge|25_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|arc:challenge|25_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|gsm8k|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|gsm8k|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hellaswag|10_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hellaswag|10_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T08-23-12.643556.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T17-50-41.215761.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T17-50-41.215761.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- '**/details_harness|winogrande|5_2024-04-07T08-23-12.643556.parquet'
- split: 2024_04_07T17_50_41.215761
path:
- '**/details_harness|winogrande|5_2024-04-07T17-50-41.215761.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T17-50-41.215761.parquet'
- config_name: results
data_files:
- split: 2024_04_07T08_23_12.643556
path:
- results_2024-04-07T08-23-12.643556.parquet
- split: 2024_04_07T17_50_41.215761
path:
- results_2024-04-07T17-50-41.215761.parquet
- split: latest
path:
- results_2024-04-07T17-50-41.215761.parquet
---
# Dataset Card for Evaluation run of Locutusque/OpenCerebrum-1.5-Mistral-7B-v0.2-beta
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/OpenCerebrum-1.5-Mistral-7B-v0.2-beta](https://huggingface.co/Locutusque/OpenCerebrum-1.5-Mistral-7B-v0.2-beta) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.5-Mistral-7B-v0.2-beta",
"harness_winogrande_5",
split="train")
```
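The same call works for any other config listed in this card, and each run is also kept as its own timestamped split. Below is a minimal sketch; the config and split names are taken verbatim from the `configs` section above:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.5-Mistral-7B-v0.2-beta"

# Aggregated per-run scores (mirrors the JSON shown under "Latest results").
results = load_dataset(REPO, "results", split="latest")

# A specific earlier run, addressed by its timestamped split name.
earlier = load_dataset(REPO, "harness_winogrande_5", split="2024_04_07T08_23_12.643556")
```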
## Latest results
These are the [latest results from run 2024-04-07T17:50:41.215761](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__OpenCerebrum-1.5-Mistral-7B-v0.2-beta/blob/main/results_2024-04-07T17-50-41.215761.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6314331340791337,
"acc_stderr": 0.0325188255122189,
"acc_norm": 0.6366516060652345,
"acc_norm_stderr": 0.033174336324203274,
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.4543367097746932,
"mc2_stderr": 0.014562184308244347
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.01447800569418253,
"acc_norm": 0.5998293515358362,
"acc_norm_stderr": 0.014317197787809174
},
"harness|hellaswag|10": {
"acc": 0.6331408086038638,
"acc_stderr": 0.004809626723626825,
"acc_norm": 0.8350926110336586,
"acc_norm_stderr": 0.0037033852685121743
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.02525303255499769,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.02525303255499769
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139403,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139403
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.02381447708659355,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.02381447708659355
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.617948717948718,
"acc_stderr": 0.024635549163908234,
"acc_norm": 0.617948717948718,
"acc_norm_stderr": 0.024635549163908234
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6302521008403361,
"acc_stderr": 0.03135709599613591,
"acc_norm": 0.6302521008403361,
"acc_norm_stderr": 0.03135709599613591
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266868,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266868
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7990196078431373,
"acc_stderr": 0.02812597226565438,
"acc_norm": 0.7990196078431373,
"acc_norm_stderr": 0.02812597226565438
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229962,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229962
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.672645739910314,
"acc_stderr": 0.03149384670994131,
"acc_norm": 0.672645739910314,
"acc_norm_stderr": 0.03149384670994131
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5625,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636863,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636863
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37318435754189944,
"acc_stderr": 0.01617569201338196,
"acc_norm": 0.37318435754189944,
"acc_norm_stderr": 0.01617569201338196
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667874,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667874
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900926,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900926
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4511082138200782,
"acc_stderr": 0.012709037347346233,
"acc_norm": 0.4511082138200782,
"acc_norm_stderr": 0.012709037347346233
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462927,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462927
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7224489795918367,
"acc_stderr": 0.02866685779027465,
"acc_norm": 0.7224489795918367,
"acc_norm_stderr": 0.02866685779027465
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827075,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827075
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835816,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835816
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.30599755201958384,
"mc1_stderr": 0.016132229728155045,
"mc2": 0.4543367097746932,
"mc2_stderr": 0.014562184308244347
},
"harness|winogrande|5": {
"acc": 0.7837411207576953,
"acc_stderr": 0.011570614861409347
},
"harness|gsm8k|5": {
"acc": 0.40561031084154664,
"acc_stderr": 0.01352484889446211
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Smuggling1710/vERPv2 | Smuggling1710 | "2024-04-08T03:16:04Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-07T09:00:55Z" | ---
license: apache-2.0
---
|
ChocolateBlack/Ishiu | ChocolateBlack | "2024-04-07T09:17:58Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T09:17:47Z" | ---
license: apache-2.0
---
|
Gelid/Placeholder_name | Gelid | "2024-04-07T09:57:45Z" | 0 | 0 | [
"language:en",
"license:cc-by-nc-sa-4.0",
"region:us"
] | null | "2024-04-07T09:53:32Z" | ---
license: cc-by-nc-sa-4.0
language:
- en
---
# Dataset Card for Dataset Name
<!-- Provide a quick summary of the dataset. -->
This dataset card aims to be a base template for new datasets. It has been generated using [this raw template](https://github.com/huggingface/huggingface_hub/blob/main/src/huggingface_hub/templates/datasetcard_template.md?plain=1).
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
chaoscodes/refinedweb-500 | chaoscodes | "2024-04-07T11:01:28Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T10:27:33Z" | ---
license: apache-2.0
---
|
fegounna/GMP_long | fegounna | "2024-04-07T10:49:19Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T10:48:58Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 931485079
num_examples: 409032
download_size: 384683488
dataset_size: 931485079
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Seeker38/image_text_wikipedia_vi | Seeker38 | "2024-04-10T13:08:49Z" | 0 | 0 | [
"source_datasets:original",
"language:vi",
"size_categories:100K<n<1M",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"wikipedia",
"images",
"text",
"LM"
] | null | "2024-04-07T11:18:01Z" | ---
language:
- vi
pretty_name: Images and corresponding abstracts in Vietnamese Wikipedia
source_datasets:
- original
size_categories:
- 100K<n<1M
tags:
- wikipedia
- images
- text
- LM
dataset_info:
features:
- name: image
dtype: image
- name: title
dtype: string
- name: text
dtype: string
# splits:
# - name: train
---
# Dataset Card for image_text_wikipedia_vi
### Dataset Summary
Dataset Summary: Image-Text Wikipedia Abstracts (Vietnamese version) <br>
This dataset comprises nearly 380,000 pairs of images and corresponding textual abstracts extracted from Vietnamese Wikipedia articles. The dataset is designed to facilitate research and development in the field of multimodal learning, particularly in tasks that involve understanding and processing both textual and visual information.
Description:
- Total Images: 374,748
- Total Textual Abstracts: 374,748
Dataset Composition:
- Each entry in the dataset consists of an image along with the corresponding abstract text extracted from the introductory section of the Vietnamese Wikipedia article.
- The images are diverse in content, ranging from objects and scenes to landmarks and people, providing a rich and varied set of visual information.
### Data Collection:
The dataset was curated by combining two methods:
- Extracting and filtering abstract text directly from the XML Wikimedia dump file.
- Scraping Vietnamese Wikipedia articles, focusing on the introductory paragraphs known as abstracts. These abstracts serve as concise summaries of the corresponding articles, providing context and key information related to the image.
### Intended Use:
Researchers and developers can utilize this dataset for various tasks such as:
- Multimodal learning: Training models to understand and generate descriptions for both images and text.
- Image captioning: Generating descriptive captions for images.
- Visual question answering (VQA): Developing models that can answer questions about visual content.
- Cross-modal retrieval: Matching images to their corresponding textual abstracts and vice versa.
### Data Preprocessing:
- Image Format: The images are provided in a standardized JPG format.
- Text Preprocessing: The textual abstracts have undergone basic preprocessing steps, such as removal of leftover XML brackets, removal of stray characters such as '\u00A0' (the non-breaking space), removal of citation markers like [1], [2], [3], and removal of unnecessary empty lines inside each text; a minimal cleaning sketch follows below.
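As a rough illustration, the cleaning steps above can be approximated with a few string operations and regular expressions. The function below is a minimal sketch under that assumption, not the exact pipeline used to build the dataset:
```python
import re

def clean_abstract(text: str) -> str:
    """Approximate the preprocessing steps described above."""
    text = text.replace("\u00A0", " ")       # replace non-breaking spaces
    text = re.sub(r"\[\d+\]", "", text)      # drop citation markers like [1], [2], [3]
    text = re.sub(r"\n\s*\n+", "\n", text)   # remove unnecessary empty lines
    return text.strip()
```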
### Potential Challenges:
- Language Complexity: As abstracts are extracted from Wikipedia, the text might include complex vocabulary and diverse topics.
- Ambiguity: Some abstracts may contain ambiguous or figurative language, challenging comprehension.
- Image Quality: Variation in image quality and resolution may impact model performance.
- Text length imbalance: The longest text has a length of 8,903 while the shortest is just 1. This imbalance can lead to very high RAM usage when training sequence models such as LSTMs (see the sketch below).
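To gauge this imbalance before training, you can inspect the length distribution directly. The snippet below is a small sketch; the 2,000-character cutoff is only an example threshold, not a recommendation from the dataset authors:
```python
from datasets import load_dataset

dataset = load_dataset("Seeker38/image_text_wikipedia_vi", split="train")

lengths = [len(t) for t in dataset["text"]]
print(f"min: {min(lengths)}, max: {max(lengths)}, mean: {sum(lengths) / len(lengths):.1f}")

# Optionally filter out extreme outliers before training length-sensitive models.
filtered = dataset.filter(lambda ex: len(ex["text"]) <= 2000)
```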
### View dataset:
There are two ways to load the dataset:
<b>1. Use the datasets library without downloading the dataset locally</b>
```python
from datasets import load_dataset
dataset = load_dataset("Seeker38/image_text_wikipedia_vi", split="train")
```
##### You can use this <b>[Google Colab](https://colab.research.google.com/drive/1BOAEsiVXNGm__vhZ4v_oyqytweG3JTm_?usp=sharing)</b> to see a short viewing demo.
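If you prefer not to materialize all ~375,000 rows at once, the datasets library also supports streaming mode; the following is a minimal sketch:
```python
from datasets import load_dataset

# Stream records one by one instead of downloading the full dataset first.
streamed = load_dataset("Seeker38/image_text_wikipedia_vi", split="train", streaming=True)
for example in streamed.take(3):
    print(example["title"])
```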
<b>2. For a dataset that has been downloaded locally</b>
```python
import pandas as pd
from datasets import Dataset
parquet_file = 'articles_data.parquet'
df = pd.read_parquet(parquet_file)
# Convert the pandas DataFrame to a datasets.arrow_dataset.Dataset object
dataset = Dataset.from_pandas(df)
```
<b>To view an element's text</b>
```python
# Example: element number 3
dataset[3]["text"]
```
<b>If you use the second method, then to view (or use for training) an element's image, you need to include the conversion step</b>
```python
from PIL import Image
import io
# Example: element number 3
image_bytes = dataset[3]["image"]["bytes"]
# Convert bytes to Image
image = Image.open(io.BytesIO(image_bytes))
image_rgb = image.convert("RGB")  # some images raise "ValueError: Could not save to JPEG" when displayed unless converted to RGB
image_rgb
```
<b>Otherwise</b>
```python
dataset[2]["image"]
``` |
xlangai/ubuntu_x86 | xlangai | "2024-07-18T05:35:13Z" | 0 | 2 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-07T11:30:42Z" | ---
license: apache-2.0
---
|
ChavyvAkvar/chai-reward-kto-trainer-v2 | ChavyvAkvar | "2024-04-07T15:27:18Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T13:14:46Z" | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 21896726
num_examples: 10735
download_size: 12273033
dataset_size: 21896726
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Kenneth12342/llm_prompt_tuning | Kenneth12342 | "2024-04-07T13:17:35Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T13:17:29Z" | ---
dataset_info:
features:
- name: Sequence Number
dtype: int64
- name: input
dtype: string
- name: Label
dtype: int64
splits:
- name: train
num_bytes: 642859
num_examples: 800
- name: validation
num_bytes: 63568
num_examples: 100
- name: test
num_bytes: 76958
num_examples: 100
download_size: 516756
dataset_size: 783385
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard-old/details_speakleash__Bielik-7B-v0.1 | open-llm-leaderboard-old | "2024-04-09T23:37:53Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T14:00:37Z" | ---
pretty_name: Evaluation run of speakleash/Bielik-7B-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [speakleash/Bielik-7B-v0.1](https://huggingface.co/speakleash/Bielik-7B-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T23:35:06.556889](https://huggingface.co/datasets/open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1/blob/main/results_2024-04-09T23-35-06.556889.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4716293062756141,\n\
\ \"acc_stderr\": 0.03461349827241483,\n \"acc_norm\": 0.47480802187603194,\n\
\ \"acc_norm_stderr\": 0.035346442062677605,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4320429372007698,\n\
\ \"mc2_stderr\": 0.014925535179229217\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.41638225255972694,\n \"acc_stderr\": 0.014405618279436176,\n\
\ \"acc_norm\": 0.4522184300341297,\n \"acc_norm_stderr\": 0.014544519880633832\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5073690499900418,\n\
\ \"acc_stderr\": 0.004989239462835232,\n \"acc_norm\": 0.6792471619199363,\n\
\ \"acc_norm_stderr\": 0.004658120152230819\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4605263157894737,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.4605263157894737,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.47547169811320755,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.47547169811320755,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04148415739394154,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04148415739394154\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4624277456647399,\n\
\ \"acc_stderr\": 0.0380168510452446,\n \"acc_norm\": 0.4624277456647399,\n\
\ \"acc_norm_stderr\": 0.0380168510452446\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808778,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808778\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n\
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.40425531914893614,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481425,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481425\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4206896551724138,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.4206896551724138,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3386243386243386,\n \"acc_stderr\": 0.024373197867983063,\n \"\
acc_norm\": 0.3386243386243386,\n \"acc_norm_stderr\": 0.024373197867983063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5516129032258065,\n\
\ \"acc_stderr\": 0.02829205683011274,\n \"acc_norm\": 0.5516129032258065,\n\
\ \"acc_norm_stderr\": 0.02829205683011274\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.037818873532059816,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.037818873532059816\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5909090909090909,\n \"acc_stderr\": 0.03502975799413007,\n \"\
acc_norm\": 0.5909090909090909,\n \"acc_norm_stderr\": 0.03502975799413007\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n\
\ \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.4205128205128205,\n \"acc_stderr\": 0.02502861027671086,\n \
\ \"acc_norm\": 0.4205128205128205,\n \"acc_norm_stderr\": 0.02502861027671086\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2814814814814815,\n \"acc_stderr\": 0.027420019350945284,\n \
\ \"acc_norm\": 0.2814814814814815,\n \"acc_norm_stderr\": 0.027420019350945284\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3949579831932773,\n \"acc_stderr\": 0.031753678460966245,\n\
\ \"acc_norm\": 0.3949579831932773,\n \"acc_norm_stderr\": 0.031753678460966245\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6036697247706422,\n \"acc_stderr\": 0.02097146994790053,\n \"\
acc_norm\": 0.6036697247706422,\n \"acc_norm_stderr\": 0.02097146994790053\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2824074074074074,\n \"acc_stderr\": 0.030701372111510923,\n \"\
acc_norm\": 0.2824074074074074,\n \"acc_norm_stderr\": 0.030701372111510923\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.5784313725490197,\n \"acc_stderr\": 0.034658681963807614,\n \"\
acc_norm\": 0.5784313725490197,\n \"acc_norm_stderr\": 0.034658681963807614\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6540084388185654,\n \"acc_stderr\": 0.03096481058878671,\n \
\ \"acc_norm\": 0.6540084388185654,\n \"acc_norm_stderr\": 0.03096481058878671\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5426008968609866,\n\
\ \"acc_stderr\": 0.033435777055830646,\n \"acc_norm\": 0.5426008968609866,\n\
\ \"acc_norm_stderr\": 0.033435777055830646\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6115702479338843,\n \"acc_stderr\": 0.04449270350068382,\n \"\
acc_norm\": 0.6115702479338843,\n \"acc_norm_stderr\": 0.04449270350068382\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5644171779141104,\n \"acc_stderr\": 0.03895632464138937,\n\
\ \"acc_norm\": 0.5644171779141104,\n \"acc_norm_stderr\": 0.03895632464138937\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.38392857142857145,\n\
\ \"acc_stderr\": 0.04616143075028547,\n \"acc_norm\": 0.38392857142857145,\n\
\ \"acc_norm_stderr\": 0.04616143075028547\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6116504854368932,\n \"acc_stderr\": 0.048257293373563895,\n\
\ \"acc_norm\": 0.6116504854368932,\n \"acc_norm_stderr\": 0.048257293373563895\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7094017094017094,\n\
\ \"acc_stderr\": 0.029745048572674047,\n \"acc_norm\": 0.7094017094017094,\n\
\ \"acc_norm_stderr\": 0.029745048572674047\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6360153256704981,\n\
\ \"acc_stderr\": 0.017205684809032232,\n \"acc_norm\": 0.6360153256704981,\n\
\ \"acc_norm_stderr\": 0.017205684809032232\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4913294797687861,\n \"acc_stderr\": 0.0269150473553698,\n\
\ \"acc_norm\": 0.4913294797687861,\n \"acc_norm_stderr\": 0.0269150473553698\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2748603351955307,\n\
\ \"acc_stderr\": 0.014931316703220503,\n \"acc_norm\": 0.2748603351955307,\n\
\ \"acc_norm_stderr\": 0.014931316703220503\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.028555827516528784,\n\
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.028555827516528784\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5305466237942122,\n\
\ \"acc_stderr\": 0.028345045864840625,\n \"acc_norm\": 0.5305466237942122,\n\
\ \"acc_norm_stderr\": 0.028345045864840625\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.02782074420373286,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.02782074420373286\n },\n\
\ \"harness|hendrycksTest-professional_accounting|5\": {\n \"acc\": 0.3262411347517731,\n\
\ \"acc_stderr\": 0.027968453043563168,\n \"acc_norm\": 0.3262411347517731,\n\
\ \"acc_norm_stderr\": 0.027968453043563168\n },\n \"harness|hendrycksTest-professional_law|5\"\
: {\n \"acc\": 0.3376792698826597,\n \"acc_stderr\": 0.012078563777145552,\n\
\ \"acc_norm\": 0.3376792698826597,\n \"acc_norm_stderr\": 0.012078563777145552\n\
\ },\n \"harness|hendrycksTest-professional_medicine|5\": {\n \"acc\"\
: 0.4338235294117647,\n \"acc_stderr\": 0.030105636570016636,\n \"\
acc_norm\": 0.4338235294117647,\n \"acc_norm_stderr\": 0.030105636570016636\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.42810457516339867,\n \"acc_stderr\": 0.020017629214213108,\n \
\ \"acc_norm\": 0.42810457516339867,\n \"acc_norm_stderr\": 0.020017629214213108\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972744,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972744\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5183673469387755,\n \"acc_stderr\": 0.03198761546763127,\n\
\ \"acc_norm\": 0.5183673469387755,\n \"acc_norm_stderr\": 0.03198761546763127\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.681592039800995,\n\
\ \"acc_stderr\": 0.03294118479054095,\n \"acc_norm\": 0.681592039800995,\n\
\ \"acc_norm_stderr\": 0.03294118479054095\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.41566265060240964,\n\
\ \"acc_stderr\": 0.038367221765980515,\n \"acc_norm\": 0.41566265060240964,\n\
\ \"acc_norm_stderr\": 0.038367221765980515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4320429372007698,\n\
\ \"mc2_stderr\": 0.014925535179229217\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6685082872928176,\n \"acc_stderr\": 0.01323039719896465\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.29492039423805916,\n \
\ \"acc_stderr\": 0.012560698010954767\n }\n}\n```"
repo_url: https://huggingface.co/speakleash/Bielik-7B-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|arc:challenge|25_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|gsm8k|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hellaswag|10_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T13-58-19.064215.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-35-06.556889.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T23-35-06.556889.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- '**/details_harness|winogrande|5_2024-04-07T13-58-19.064215.parquet'
- split: 2024_04_09T23_35_06.556889
path:
- '**/details_harness|winogrande|5_2024-04-09T23-35-06.556889.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T23-35-06.556889.parquet'
- config_name: results
data_files:
- split: 2024_04_07T13_58_19.064215
path:
- results_2024-04-07T13-58-19.064215.parquet
- split: 2024_04_09T23_35_06.556889
path:
- results_2024-04-09T23-35-06.556889.parquet
- split: latest
path:
- results_2024-04-09T23-35-06.556889.parquet
---
# Dataset Card for Evaluation run of speakleash/Bielik-7B-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [speakleash/Bielik-7B-v0.1](https://huggingface.co/speakleash/Bielik-7B-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 runs. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1",
"harness_winogrande_5",
split="train")
```
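The `split` argument also accepts any of the timestamped splits listed in the configuration above, so a specific run can be loaded directly (a minimal sketch; the split name below is one of the run timestamps from this repository's `data_files` listing):
```python
from datasets import load_dataset

# Load the details from the 2024-04-07 run rather than the latest one;
# split names mirror the run timestamps declared in the dataset configs.
data = load_dataset("open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1",
	"harness_winogrande_5",
	split="2024_04_07T13_58_19.064215")
```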
## Latest results
These are the [latest results from run 2024-04-09T23:35:06.556889](https://huggingface.co/datasets/open-llm-leaderboard/details_speakleash__Bielik-7B-v0.1/blob/main/results_2024-04-09T23-35-06.556889.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4716293062756141,
"acc_stderr": 0.03461349827241483,
"acc_norm": 0.47480802187603194,
"acc_norm_stderr": 0.035346442062677605,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4320429372007698,
"mc2_stderr": 0.014925535179229217
},
"harness|arc:challenge|25": {
"acc": 0.41638225255972694,
"acc_stderr": 0.014405618279436176,
"acc_norm": 0.4522184300341297,
"acc_norm_stderr": 0.014544519880633832
},
"harness|hellaswag|10": {
"acc": 0.5073690499900418,
"acc_stderr": 0.004989239462835232,
"acc_norm": 0.6792471619199363,
"acc_norm_stderr": 0.004658120152230819
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4605263157894737,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.4605263157894737,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.47547169811320755,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.47547169811320755,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4375,
"acc_stderr": 0.04148415739394154,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04148415739394154
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.0380168510452446,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.0380168510452446
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808778,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808778
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481425,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481425
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4206896551724138,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.4206896551724138,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3386243386243386,
"acc_stderr": 0.024373197867983063,
"acc_norm": 0.3386243386243386,
"acc_norm_stderr": 0.024373197867983063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5516129032258065,
"acc_stderr": 0.02829205683011274,
"acc_norm": 0.5516129032258065,
"acc_norm_stderr": 0.02829205683011274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.037818873532059816,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.037818873532059816
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5909090909090909,
"acc_stderr": 0.03502975799413007,
"acc_norm": 0.5909090909090909,
"acc_norm_stderr": 0.03502975799413007
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.4205128205128205,
"acc_stderr": 0.02502861027671086,
"acc_norm": 0.4205128205128205,
"acc_norm_stderr": 0.02502861027671086
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2814814814814815,
"acc_stderr": 0.027420019350945284,
"acc_norm": 0.2814814814814815,
"acc_norm_stderr": 0.027420019350945284
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3949579831932773,
"acc_stderr": 0.031753678460966245,
"acc_norm": 0.3949579831932773,
"acc_norm_stderr": 0.031753678460966245
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6036697247706422,
"acc_stderr": 0.02097146994790053,
"acc_norm": 0.6036697247706422,
"acc_norm_stderr": 0.02097146994790053
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2824074074074074,
"acc_stderr": 0.030701372111510923,
"acc_norm": 0.2824074074074074,
"acc_norm_stderr": 0.030701372111510923
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5784313725490197,
"acc_stderr": 0.034658681963807614,
"acc_norm": 0.5784313725490197,
"acc_norm_stderr": 0.034658681963807614
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6540084388185654,
"acc_stderr": 0.03096481058878671,
"acc_norm": 0.6540084388185654,
"acc_norm_stderr": 0.03096481058878671
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5426008968609866,
"acc_stderr": 0.033435777055830646,
"acc_norm": 0.5426008968609866,
"acc_norm_stderr": 0.033435777055830646
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6115702479338843,
"acc_stderr": 0.04449270350068382,
"acc_norm": 0.6115702479338843,
"acc_norm_stderr": 0.04449270350068382
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5644171779141104,
"acc_stderr": 0.03895632464138937,
"acc_norm": 0.5644171779141104,
"acc_norm_stderr": 0.03895632464138937
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.38392857142857145,
"acc_stderr": 0.04616143075028547,
"acc_norm": 0.38392857142857145,
"acc_norm_stderr": 0.04616143075028547
},
"harness|hendrycksTest-management|5": {
"acc": 0.6116504854368932,
"acc_stderr": 0.048257293373563895,
"acc_norm": 0.6116504854368932,
"acc_norm_stderr": 0.048257293373563895
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7094017094017094,
"acc_stderr": 0.029745048572674047,
"acc_norm": 0.7094017094017094,
"acc_norm_stderr": 0.029745048572674047
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6360153256704981,
"acc_stderr": 0.017205684809032232,
"acc_norm": 0.6360153256704981,
"acc_norm_stderr": 0.017205684809032232
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4913294797687861,
"acc_stderr": 0.0269150473553698,
"acc_norm": 0.4913294797687861,
"acc_norm_stderr": 0.0269150473553698
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2748603351955307,
"acc_stderr": 0.014931316703220503,
"acc_norm": 0.2748603351955307,
"acc_norm_stderr": 0.014931316703220503
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.028555827516528784,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.028555827516528784
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5305466237942122,
"acc_stderr": 0.028345045864840625,
"acc_norm": 0.5305466237942122,
"acc_norm_stderr": 0.028345045864840625
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5,
"acc_stderr": 0.02782074420373286,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02782074420373286
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3262411347517731,
"acc_stderr": 0.027968453043563168,
"acc_norm": 0.3262411347517731,
"acc_norm_stderr": 0.027968453043563168
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3376792698826597,
"acc_stderr": 0.012078563777145552,
"acc_norm": 0.3376792698826597,
"acc_norm_stderr": 0.012078563777145552
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4338235294117647,
"acc_stderr": 0.030105636570016636,
"acc_norm": 0.4338235294117647,
"acc_norm_stderr": 0.030105636570016636
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.42810457516339867,
"acc_stderr": 0.020017629214213108,
"acc_norm": 0.42810457516339867,
"acc_norm_stderr": 0.020017629214213108
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972744,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972744
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5183673469387755,
"acc_stderr": 0.03198761546763127,
"acc_norm": 0.5183673469387755,
"acc_norm_stderr": 0.03198761546763127
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.681592039800995,
"acc_stderr": 0.03294118479054095,
"acc_norm": 0.681592039800995,
"acc_norm_stderr": 0.03294118479054095
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.41566265060240964,
"acc_stderr": 0.038367221765980515,
"acc_norm": 0.41566265060240964,
"acc_norm_stderr": 0.038367221765980515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4320429372007698,
"mc2_stderr": 0.014925535179229217
},
"harness|winogrande|5": {
"acc": 0.6685082872928176,
"acc_stderr": 0.01323039719896465
},
"harness|gsm8k|5": {
"acc": 0.29492039423805916,
"acc_stderr": 0.012560698010954767
}
}
```
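As a quick sanity check (a sketch, not part of the generated card: it assumes the dict printed above has been parsed into a Python variable named `results`), the per-task accuracies can be averaged and compared against the "all" entry:
```python
# Sketch: `results` is assumed to hold the dict shown in the block above.
# Tasks without an "acc" key (e.g. truthfulqa:mc, which reports mc1/mc2)
# are skipped, as is the aggregated "all" entry itself.
task_accs = [metrics["acc"] for task, metrics in results.items()
             if task != "all" and "acc" in metrics]
mean_acc = sum(task_accs) / len(task_accs)
print(f"mean acc over {len(task_accs)} tasks: {mean_acc:.4f}")
```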
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
adamo1139/toxic-dpo-natural-v3 | adamo1139 | "2024-04-07T14:11:54Z" | 0 | 0 | [
"license:other",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T14:11:45Z" | ---
license: other
license_name: other
license_link: LICENSE
---
|
liaad/machine_translation_dataset | liaad | "2024-04-08T10:32:37Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T14:47:37Z" | ---
dataset_info:
- config_name: journalistic
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 1312620204
num_examples: 1845205
download_size: 869897684
dataset_size: 1312620204
- config_name: legal
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': PT-PT
'1': PT-BR
splits:
- name: train
num_bytes: 149071750
num_examples: 477903
download_size: 80693729
dataset_size: 149071750
- config_name: literature
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 55905796
num_examples: 225
download_size: 34170187
dataset_size: 55905796
- config_name: politics
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 367519469
num_examples: 14328
download_size: 199770940
dataset_size: 367519469
- config_name: social_media
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 372374266
num_examples: 3074774
download_size: 267074829
dataset_size: 372374266
- config_name: web
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': PT-PT
'1': PT-BR
splits:
- name: train
num_bytes: 1373778486
num_examples: 279555
download_size: 674977136
dataset_size: 1373778486
configs:
- config_name: journalistic
data_files:
- split: train
path: journalistic/train-*
- config_name: legal
data_files:
- split: train
path: legal/train-*
- config_name: literature
data_files:
- split: train
path: literature/train-*
- config_name: politics
data_files:
- split: train
path: politics/train-*
- config_name: social_media
data_files:
- split: train
path: social_media/train-*
- config_name: web
data_files:
- split: train
path: web/train-*
---
|
wkinc/drupal-dataset | wkinc | "2024-04-07T15:13:04Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-07T15:13:04Z" | ---
license: apache-2.0
---
|
KagglingFace/FYP-KiTS-A-3dlowres | KagglingFace | "2024-04-07T15:14:28Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-07T15:14:28Z" | ---
license: mit
---
|
ChavyvAkvar/chai-reward-kto-trainer-v1 | ChavyvAkvar | "2024-04-07T15:26:02Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T15:20:06Z" | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: completion
dtype: string
- name: label
dtype: bool
splits:
- name: train
num_bytes: 65473456
num_examples: 33841
download_size: 37024405
dataset_size: 65473456
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ramixpe/sp_simple | ramixpe | "2024-04-07T15:21:37Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T15:21:34Z" | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 8694932
num_examples: 20551
download_size: 1989750
dataset_size: 8694932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
PLS442/Bolofos | PLS442 | "2024-04-07T15:44:10Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-07T15:43:39Z" | ---
license: openrail
---
|
lachieandmitch/hugging | lachieandmitch | "2024-04-07T15:48:48Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-07T15:48:48Z" | ---
license: apache-2.0
---
|
liaad/machine_translation_dataset_detokenized | liaad | "2024-04-08T10:33:34Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T15:52:15Z" | ---
dataset_info:
- config_name: journalistic
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 1283261148
num_examples: 1845205
download_size: 864052343
dataset_size: 1283261148
- config_name: legal
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': PT-PT
'1': PT-BR
splits:
- name: train
num_bytes: 148927683
num_examples: 477903
download_size: 91110976
dataset_size: 148927683
- config_name: literature
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 55646572
num_examples: 225
download_size: 19697267
dataset_size: 55646572
- config_name: politics
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 367487667
num_examples: 14328
download_size: 200081078
dataset_size: 367487667
- config_name: social_media
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': pt-PT
'1': pt-BR
splits:
- name: train
num_bytes: 371972738
num_examples: 3074774
download_size: 266674007
dataset_size: 371972738
- config_name: web
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': PT-PT
'1': PT-BR
splits:
- name: train
num_bytes: 1372865174
num_examples: 279555
download_size: 705408533
dataset_size: 1372865174
configs:
- config_name: journalistic
data_files:
- split: train
path: journalistic/train-*
- config_name: legal
data_files:
- split: train
path: legal/train-*
- config_name: literature
data_files:
- split: train
path: literature/train-*
- config_name: politics
data_files:
- split: train
path: politics/train-*
- config_name: social_media
data_files:
- split: train
path: social_media/train-*
- config_name: web
data_files:
- split: train
path: web/train-*
---
|
McSpicyWithMilo/instruction-type-cv | McSpicyWithMilo | "2024-04-07T15:53:22Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T15:53:16Z" | ---
dataset_info:
features:
- name: instruction_type
dtype: string
- name: instruction
dtype: string
splits:
- name: train
num_bytes: 35040
num_examples: 400
download_size: 15153
dataset_size: 35040
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Dataset Card for "instruction-type-cv"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
LimYeri/leetcode_with_youtube_captions | LimYeri | "2024-05-29T11:06:00Z" | 0 | 1 | [
"task_categories:text-classification",
"task_categories:text-generation",
"language:en",
"license:mit",
"size_categories:10K<n<100K",
"format:parquet",
"modality:image",
"modality:tabular",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us",
"code"
] | [
"text-classification",
"text-generation"
] | "2024-04-07T16:25:18Z" | ---
language:
- en
license: mit
size_categories:
- 10K<n<100K
task_categories:
- text-classification
- text-generation
pretty_name: LeetCode information with YouTube captions
tags:
- code
dataset_info:
features:
- name: cc_content
dtype: string
- name: id
dtype: int64
- name: thumbnail
dtype: string
- name: title
dtype: string
- name: question_content
dtype: string
- name: java
dtype: string
- name: c++
dtype: string
- name: python
dtype: string
- name: javascript
dtype: string
- name: title_slug
dtype: string
- name: tag
dtype: string
- name: level
dtype: string
- name: success_rate
dtype: float64
- name: total_submission
dtype: float64
- name: total_accepted
dtype: float64
- name: question_likes
dtype: float64
- name: question_dislikes
dtype: float64
- name: question_hints
dtype: string
- name: similar_question_ids
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 576312572
num_examples: 18136
download_size: 150441753
dataset_size: 576312572
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
CarlosMorales/news_bbc_international_conflicts | CarlosMorales | "2024-04-07T16:49:36Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T16:29:36Z" | ---
dataset_info:
features:
- name: conflict
dtype: string
- name: title
dtype: string
- name: published_date
dtype: string
- name: description
dtype: string
- name: section
dtype: string
- name: content
dtype: string
- name: link
dtype: string
- name: Name
dtype: string
- name: Representation
sequence: string
- name: Top_n_words
dtype: string
- name: Representative_document
dtype: bool
splits:
- name: train
num_bytes: 45095
num_examples: 23
download_size: 39726
dataset_size: 45095
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jungledude23/llama-subtitle-hallucinations | jungledude23 | "2024-04-07T16:33:50Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T16:33:41Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 36236776
num_examples: 16528
download_size: 6499965
dataset_size: 36236776
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Vishwanath0912/qa_en_hi | Vishwanath0912 | "2024-04-07T17:12:37Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T17:07:47Z" | ---
license: mit
---
|
open-llm-leaderboard-old/details_Antonio88__TaliML-7B-ITA-V.1.0.FINAL | open-llm-leaderboard-old | "2024-04-07T17:12:04Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T17:11:41Z" | ---
pretty_name: Evaluation run of Antonio88/TaliML-7B-ITA-V.1.0.FINAL
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Antonio88/TaliML-7B-ITA-V.1.0.FINAL](https://huggingface.co/Antonio88/TaliML-7B-ITA-V.1.0.FINAL)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run. Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Antonio88__TaliML-7B-ITA-V.1.0.FINAL\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-07T17:09:23.512847](https://huggingface.co/datasets/open-llm-leaderboard/details_Antonio88__TaliML-7B-ITA-V.1.0.FINAL/blob/main/results_2024-04-07T17-09-23.512847.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26227991564437625,\n\
\ \"acc_stderr\": 0.030963173407181795,\n \"acc_norm\": 0.2637869841194498,\n\
\ \"acc_norm_stderr\": 0.03178696554014809,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474202,\n \"mc2\": 0.487585070970592,\n\
\ \"mc2_stderr\": 0.016456631483411588\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.23293515358361774,\n \"acc_stderr\": 0.012352507042617403,\n\
\ \"acc_norm\": 0.25853242320819114,\n \"acc_norm_stderr\": 0.012794553754288675\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2991435968930492,\n\
\ \"acc_stderr\": 0.004569470678071266,\n \"acc_norm\": 0.37223660625373434,\n\
\ \"acc_norm_stderr\": 0.004824130528590597\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.04072314811876837,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.04072314811876837\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3223684210526316,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.3223684210526316,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2188679245283019,\n \"acc_stderr\": 0.02544786382510861,\n\
\ \"acc_norm\": 0.2188679245283019,\n \"acc_norm_stderr\": 0.02544786382510861\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2138728323699422,\n\
\ \"acc_stderr\": 0.03126511206173044,\n \"acc_norm\": 0.2138728323699422,\n\
\ \"acc_norm_stderr\": 0.03126511206173044\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082633,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082633\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.0261488180184245,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.0261488180184245\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438015,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438015\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.15079365079365079,\n\
\ \"acc_stderr\": 0.03200686497287392,\n \"acc_norm\": 0.15079365079365079,\n\
\ \"acc_norm_stderr\": 0.03200686497287392\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3161290322580645,\n\
\ \"acc_stderr\": 0.02645087448904277,\n \"acc_norm\": 0.3161290322580645,\n\
\ \"acc_norm_stderr\": 0.02645087448904277\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2955665024630542,\n \"acc_stderr\": 0.032104944337514575,\n\
\ \"acc_norm\": 0.2955665024630542,\n \"acc_norm_stderr\": 0.032104944337514575\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2545454545454545,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.2545454545454545,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.32323232323232326,\n \"acc_stderr\": 0.033322999210706444,\n \"\
acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.033322999210706444\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.24870466321243523,\n \"acc_stderr\": 0.03119584087770029,\n\
\ \"acc_norm\": 0.24870466321243523,\n \"acc_norm_stderr\": 0.03119584087770029\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2358974358974359,\n \"acc_stderr\": 0.021525965407408726,\n\
\ \"acc_norm\": 0.2358974358974359,\n \"acc_norm_stderr\": 0.021525965407408726\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3445378151260504,\n \"acc_stderr\": 0.03086868260412163,\n \
\ \"acc_norm\": 0.3445378151260504,\n \"acc_norm_stderr\": 0.03086868260412163\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658754,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.017871217767790226,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.017871217767790226\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.25980392156862747,\n\
\ \"acc_stderr\": 0.030778554678693264,\n \"acc_norm\": 0.25980392156862747,\n\
\ \"acc_norm_stderr\": 0.030778554678693264\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.26582278481012656,\n \"acc_stderr\": 0.02875679962965834,\n\
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.02875679962965834\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.20179372197309417,\n\
\ \"acc_stderr\": 0.026936111912802273,\n \"acc_norm\": 0.20179372197309417,\n\
\ \"acc_norm_stderr\": 0.026936111912802273\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.22900763358778625,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.22900763358778625,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.033220157957767414,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.033220157957767414\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952687,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952687\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19658119658119658,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.19658119658119658,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.015889888362560486,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.015889888362560486\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2138728323699422,\n \"acc_stderr\": 0.022075709251757183,\n\
\ \"acc_norm\": 0.2138728323699422,\n \"acc_norm_stderr\": 0.022075709251757183\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.25163398692810457,\n \"acc_stderr\": 0.024848018263875195,\n\
\ \"acc_norm\": 0.25163398692810457,\n \"acc_norm_stderr\": 0.024848018263875195\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24437299035369775,\n\
\ \"acc_stderr\": 0.024406162094668882,\n \"acc_norm\": 0.24437299035369775,\n\
\ \"acc_norm_stderr\": 0.024406162094668882\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2932098765432099,\n \"acc_stderr\": 0.02532988817190092,\n\
\ \"acc_norm\": 0.2932098765432099,\n \"acc_norm_stderr\": 0.02532988817190092\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22340425531914893,\n \"acc_stderr\": 0.02484792135806396,\n \
\ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.02484792135806396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23142112125162972,\n\
\ \"acc_stderr\": 0.01077146171157645,\n \"acc_norm\": 0.23142112125162972,\n\
\ \"acc_norm_stderr\": 0.01077146171157645\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4485294117647059,\n \"acc_stderr\": 0.030211479609121593,\n\
\ \"acc_norm\": 0.4485294117647059,\n \"acc_norm_stderr\": 0.030211479609121593\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2173202614379085,\n \"acc_stderr\": 0.01668482092914859,\n \
\ \"acc_norm\": 0.2173202614379085,\n \"acc_norm_stderr\": 0.01668482092914859\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.04013964554072774,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.04013964554072774\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.21890547263681592,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.21890547263681592,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.1927710843373494,\n\
\ \"acc_stderr\": 0.030709824050565274,\n \"acc_norm\": 0.1927710843373494,\n\
\ \"acc_norm_stderr\": 0.030709824050565274\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.17543859649122806,\n \"acc_stderr\": 0.029170885500727654,\n\
\ \"acc_norm\": 0.17543859649122806,\n \"acc_norm_stderr\": 0.029170885500727654\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474202,\n \"mc2\": 0.487585070970592,\n\
\ \"mc2_stderr\": 0.016456631483411588\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5343330702446725,\n \"acc_stderr\": 0.014019317531542567\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Antonio88/TaliML-7B-ITA-V.1.0.FINAL
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|arc:challenge|25_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|gsm8k|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hellaswag|10_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T17-09-23.512847.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T17-09-23.512847.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- '**/details_harness|winogrande|5_2024-04-07T17-09-23.512847.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-07T17-09-23.512847.parquet'
- config_name: results
data_files:
- split: 2024_04_07T17_09_23.512847
path:
- results_2024-04-07T17-09-23.512847.parquet
- split: latest
path:
- results_2024-04-07T17-09-23.512847.parquet
---
# Dataset Card for Evaluation run of Antonio88/TaliML-7B-ITA-V.1.0.FINAL
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Antonio88/TaliML-7B-ITA-V.1.0.FINAL](https://huggingface.co/Antonio88/TaliML-7B-ITA-V.1.0.FINAL) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Antonio88__TaliML-7B-ITA-V.1.0.FINAL",
"harness_winogrande_5",
split="train")
```
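The aggregated metrics described below live in the `results` configuration, with the usual `latest` split alias. A minimal sketch of pulling them (the config and split names are taken from this card's metadata; the row layout is assumed to follow the standard leaderboard format):
```python
from datasets import load_dataset

# "results" and "latest" are the config and split names declared in this card's metadata.
agg = load_dataset(
    "open-llm-leaderboard/details_Antonio88__TaliML-7B-ITA-V.1.0.FINAL",
    "results",
    split="latest",
)
print(agg[0])  # a single row holding the run's aggregated metrics
```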
## Latest results
These are the [latest results from run 2024-04-07T17:09:23.512847](https://huggingface.co/datasets/open-llm-leaderboard/details_Antonio88__TaliML-7B-ITA-V.1.0.FINAL/blob/main/results_2024-04-07T17-09-23.512847.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26227991564437625,
"acc_stderr": 0.030963173407181795,
"acc_norm": 0.2637869841194498,
"acc_norm_stderr": 0.03178696554014809,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474202,
"mc2": 0.487585070970592,
"mc2_stderr": 0.016456631483411588
},
"harness|arc:challenge|25": {
"acc": 0.23293515358361774,
"acc_stderr": 0.012352507042617403,
"acc_norm": 0.25853242320819114,
"acc_norm_stderr": 0.012794553754288675
},
"harness|hellaswag|10": {
"acc": 0.2991435968930492,
"acc_stderr": 0.004569470678071266,
"acc_norm": 0.37223660625373434,
"acc_norm_stderr": 0.004824130528590597
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.04072314811876837,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.04072314811876837
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3223684210526316,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.3223684210526316,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2188679245283019,
"acc_stderr": 0.02544786382510861,
"acc_norm": 0.2188679245283019,
"acc_norm_stderr": 0.02544786382510861
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.03126511206173044,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.03126511206173044
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082633,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082633
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2,
"acc_stderr": 0.0261488180184245,
"acc_norm": 0.2,
"acc_norm_stderr": 0.0261488180184245
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438015,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438015
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.15079365079365079,
"acc_stderr": 0.03200686497287392,
"acc_norm": 0.15079365079365079,
"acc_norm_stderr": 0.03200686497287392
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.02645087448904277,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.02645087448904277
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2955665024630542,
"acc_stderr": 0.032104944337514575,
"acc_norm": 0.2955665024630542,
"acc_norm_stderr": 0.032104944337514575
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2545454545454545,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.2545454545454545,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.033322999210706444,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.033322999210706444
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.24870466321243523,
"acc_stderr": 0.03119584087770029,
"acc_norm": 0.24870466321243523,
"acc_norm_stderr": 0.03119584087770029
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2358974358974359,
"acc_stderr": 0.021525965407408726,
"acc_norm": 0.2358974358974359,
"acc_norm_stderr": 0.021525965407408726
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3445378151260504,
"acc_stderr": 0.03086868260412163,
"acc_norm": 0.3445378151260504,
"acc_norm_stderr": 0.03086868260412163
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658754,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.017871217767790226,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.017871217767790226
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.030778554678693264,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.030778554678693264
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.02875679962965834,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.02875679962965834
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.20179372197309417,
"acc_stderr": 0.026936111912802273,
"acc_norm": 0.20179372197309417,
"acc_norm_stderr": 0.026936111912802273
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.22900763358778625,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.22900763358778625,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.033220157957767414,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.033220157957767414
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952687,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952687
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19658119658119658,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.19658119658119658,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.015889888362560486,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.015889888362560486
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2138728323699422,
"acc_stderr": 0.022075709251757183,
"acc_norm": 0.2138728323699422,
"acc_norm_stderr": 0.022075709251757183
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.25163398692810457,
"acc_stderr": 0.024848018263875195,
"acc_norm": 0.25163398692810457,
"acc_norm_stderr": 0.024848018263875195
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24437299035369775,
"acc_stderr": 0.024406162094668882,
"acc_norm": 0.24437299035369775,
"acc_norm_stderr": 0.024406162094668882
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2932098765432099,
"acc_stderr": 0.02532988817190092,
"acc_norm": 0.2932098765432099,
"acc_norm_stderr": 0.02532988817190092
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.02484792135806396,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.02484792135806396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23142112125162972,
"acc_stderr": 0.01077146171157645,
"acc_norm": 0.23142112125162972,
"acc_norm_stderr": 0.01077146171157645
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4485294117647059,
"acc_stderr": 0.030211479609121593,
"acc_norm": 0.4485294117647059,
"acc_norm_stderr": 0.030211479609121593
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2173202614379085,
"acc_stderr": 0.01668482092914859,
"acc_norm": 0.2173202614379085,
"acc_norm_stderr": 0.01668482092914859
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.04013964554072774,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.04013964554072774
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.21890547263681592,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.21890547263681592,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.1927710843373494,
"acc_stderr": 0.030709824050565274,
"acc_norm": 0.1927710843373494,
"acc_norm_stderr": 0.030709824050565274
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.17543859649122806,
"acc_stderr": 0.029170885500727654,
"acc_norm": 0.17543859649122806,
"acc_norm_stderr": 0.029170885500727654
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474202,
"mc2": 0.487585070970592,
"mc2_stderr": 0.016456631483411588
},
"harness|winogrande|5": {
"acc": 0.5343330702446725,
"acc_stderr": 0.014019317531542567
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
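Every `harness|hendrycksTest-*` entry above shares the same `acc`/`acc_norm` fields, so the MMLU subtasks can be ranked directly once the block is parsed. A minimal sketch, assuming you have saved the JSON object printed above to a local file named `results.json` (the raw file on the Hub may nest these scores under additional metadata keys):
```python
import json

# Parse the per-task scores shown above (saved locally as results.json).
with open("results.json") as f:
    results = json.load(f)

# Keep only the MMLU ("hendrycksTest") subtasks and strip the harness prefix/suffix.
mmlu = {
    task.removeprefix("harness|hendrycksTest-").removesuffix("|5"): scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
}

# Print the five strongest subtasks for this run.
for task, acc in sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)[:5]:
    print(f"{task}: {acc:.3f}")
```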
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jp1924/JejuSpeech | jp1924 | "2024-04-08T04:33:52Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:audio",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T17:40:17Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
- name: standard_form
dtype: string
- name: dialect_form
dtype: string
- name: start
dtype: float32
- name: end
dtype: float32
- name: note
dtype: string
- name: eojeolList
list:
- name: id
dtype: int8
- name: eojeol
dtype: string
- name: standard
dtype: string
- name: isDialect
dtype: bool
- name: speaker
struct:
- name: id
dtype: string
- name: name
dtype: string
- name: age
dtype: string
- name: occupation
dtype: string
- name: sex
dtype: string
- name: birthplace
dtype: string
- name: principal_residence
dtype: string
- name: current_residence
dtype: string
- name: education
dtype: string
- name: metadata
struct:
- name: title
dtype: string
- name: creator
dtype: string
- name: distributor
dtype: string
- name: year
dtype: string
- name: category
dtype: string
- name: annotation_level
list: string
- name: sampling
dtype: string
- name: author
dtype: string
- name: publisher
dtype: string
- name: date
dtype: string
- name: topic
dtype: string
splits:
- name: train
num_bytes: 730867084925.104
num_examples: 2774257
- name: validation
num_bytes: 90310808652.344
num_examples: 333802
download_size: 786625196812
dataset_size: 821177893577.448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
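The frontmatter above is this card's only documentation. As a hedged sketch of how the declared schema could be inspected (the dataset id, split names, and field names all come from the metadata above; streaming is assumed purely because of the roughly 780 GB download size):
```python
from datasets import load_dataset

# Stream one record instead of downloading the full dataset up front.
ds = load_dataset("jp1924/JejuSpeech", split="train", streaming=True)
sample = next(iter(ds))

print(sample["standard_form"])           # standard-form transcription (schema above)
print(sample["dialect_form"])            # dialect-form transcription
print(sample["audio"]["sampling_rate"])  # 16000, per the feature spec
```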
|
baratilab/Eagar-Tsai-Old | baratilab | "2024-04-14T00:26:48Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T18:17:58Z" | ---
dataset_info:
- config_name: baseline
features:
- name: thetas
sequence:
sequence:
sequence:
sequence: float64
- name: times
sequence: float64
splits:
- name: m_Ti6Al4V_v_0.0_0.1_0.1
num_bytes: 117932360
num_examples: 2
- name: m_Ti6Al4V_v_0.0_2.9_0.1
num_bytes: 1768985400
num_examples: 30
download_size: 160779284
dataset_size: 1886917760
- config_name: default
features:
- name: depths
sequence:
sequence: float64
- name: lengths
sequence:
sequence: float64
- name: widths
sequence:
sequence: float64
- name: powers
sequence: int64
- name: velocities
sequence: float64
splits:
- name: train
num_bytes: 18760
num_examples: 1
download_size: 17808
dataset_size: 18760
- config_name: process_maps
features:
- name: args
struct:
- name: absorptivity
dtype: float64
- name: bound_x_start
dtype: float64
- name: bound_x_stop
dtype: float64
- name: bound_y_start
dtype: float64
- name: bound_y_stop
dtype: float64
- name: bound_z_start
dtype: float64
- name: bound_z_stop
dtype: int64
- name: cp
dtype: float64
- name: d_beam
dtype: float64
- name: k
dtype: float64
- name: material
dtype: string
- name: mesh_resolution
dtype: float64
- name: power_start
dtype: int64
- name: power_step
dtype: int64
- name: power_stop
dtype: int64
- name: rho
dtype: float64
- name: t_melt
dtype: float64
- name: velocity_start
dtype: float64
- name: velocity_step
dtype: float64
- name: velocity_stop
dtype: float64
- name: verbose
dtype: bool
- name: widths
sequence:
sequence: float64
- name: depths
sequence:
sequence: float64
- name: lengths
sequence:
sequence: float64
- name: powers
sequence: int64
- name: velocities
sequence: float64
splits:
- name: m_Ti64_p_0_20_20_v_0.0_0.0_0.1
num_bytes: 277
num_examples: 1
- name: m_Ti64_p_0_480_20_v_0.0_2.9_0.1
num_bytes: 18921
num_examples: 1
- name: m_Ti64_p_0_480_20_v_0.0_2.9_0.1_a_0.27
num_bytes: 18921
num_examples: 1
download_size: 85881
dataset_size: 38119
- config_name: process_maps_baseline
features:
- name: args
struct:
- name: bound_x_start
dtype: float64
- name: bound_x_stop
dtype: float64
- name: bound_y_start
dtype: float64
- name: bound_y_stop
dtype: float64
- name: bound_z_start
dtype: float64
- name: bound_z_stop
dtype: int64
- name: d_beam
dtype: float64
- name: material
dtype: string
- name: mesh_resolution
dtype: float64
- name: power_start
dtype: int64
- name: power_step
dtype: int64
- name: power_stop
dtype: int64
- name: velocity_start
dtype: float64
- name: velocity_step
dtype: float64
- name: velocity_stop
dtype: float64
- name: verbose
dtype: bool
- name: widths
sequence:
sequence: int64
- name: depths
sequence:
sequence: int64
- name: lengths
sequence:
sequence: int64
- name: powers
sequence: int64
- name: velocities
sequence: float64
splits:
- name: m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1
num_bytes: 18886
num_examples: 1
- name: m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1
num_bytes: 1862
num_examples: 1
download_size: 156884
dataset_size: 20748
- config_name: process_maps_baseline_test
features:
- name: args
struct:
- name: bound_x_start
dtype: float64
- name: bound_x_stop
dtype: float64
- name: bound_y_start
dtype: float64
- name: bound_y_stop
dtype: float64
- name: bound_z_start
dtype: float64
- name: bound_z_stop
dtype: int64
- name: d_beam
dtype: float64
- name: material
dtype: string
- name: mesh_resolution
dtype: float64
- name: power_start
dtype: int64
- name: power_step
dtype: int64
- name: power_stop
dtype: int64
- name: velocity_start
dtype: float64
- name: velocity_step
dtype: float64
- name: velocity_stop
dtype: float64
- name: verbose
dtype: bool
- name: widths
sequence:
sequence: float64
- name: depths
sequence:
sequence: float64
- name: lengths
sequence:
sequence: float64
- name: powers
sequence: int64
- name: velocities
sequence: float64
splits:
- name: m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1
num_bytes: 1862
num_examples: 1
- name: m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1
num_bytes: 18886
num_examples: 1
download_size: 46155
dataset_size: 20748
- config_name: process_maps_baseline_test_2
features:
- name: args
struct:
- name: bound_x_start
dtype: float64
- name: bound_x_stop
dtype: float64
- name: bound_y_start
dtype: float64
- name: bound_y_stop
dtype: float64
- name: bound_z_start
dtype: float64
- name: bound_z_stop
dtype: int64
- name: d_beam
dtype: float64
- name: material
dtype: string
- name: mesh_resolution
dtype: float64
- name: power_start
dtype: int64
- name: power_step
dtype: int64
- name: power_stop
dtype: int64
- name: velocity_start
dtype: float64
- name: velocity_step
dtype: float64
- name: velocity_stop
dtype: float64
- name: verbose
dtype: bool
- name: widths
sequence:
sequence: int64
- name: depths
sequence:
sequence: int64
- name: lengths
sequence:
sequence: int64
- name: powers
sequence: int64
- name: velocities
sequence: float64
splits:
- name: m_IN625_p_0_480_20_v_0.0_2.9_0.1
num_bytes: 18882
num_examples: 1
download_size: 13218
dataset_size: 18882
- config_name: process_maps_baseline_test_3
features:
- name: args
struct:
- name: bound_x_start
dtype: float64
- name: bound_x_stop
dtype: float64
- name: bound_y_start
dtype: float64
- name: bound_y_stop
dtype: float64
- name: bound_z_start
dtype: float64
- name: bound_z_stop
dtype: int64
- name: d_beam
dtype: float64
- name: material
dtype: string
- name: mesh_resolution
dtype: float64
- name: power_start
dtype: int64
- name: power_step
dtype: int64
- name: power_stop
dtype: int64
- name: velocity_start
dtype: float64
- name: velocity_step
dtype: float64
- name: velocity_stop
dtype: float64
- name: verbose
dtype: bool
- name: widths
sequence:
sequence: float64
- name: depths
sequence:
sequence: float64
- name: lengths
sequence:
sequence: float64
- name: powers
sequence: int64
- name: velocities
sequence: float64
splits:
- name: m_IN625_p_0_480_20_v_0.0_2.9_0.1
num_bytes: 18882
num_examples: 1
download_size: 33163
dataset_size: 18882
- config_name: process_maps_test
features:
- name: args
struct:
- name: bound_x_start
dtype: float64
- name: bound_x_stop
dtype: float64
- name: bound_y_start
dtype: float64
- name: bound_y_stop
dtype: float64
- name: bound_z_start
dtype: float64
- name: bound_z_stop
dtype: int64
- name: d_beam
dtype: float64
- name: material
dtype: string
- name: mesh_resolution
dtype: float64
- name: power_start
dtype: int64
- name: power_step
dtype: int64
- name: power_stop
dtype: int64
- name: velocity_start
dtype: float64
- name: velocity_step
dtype: float64
- name: velocity_stop
dtype: float64
- name: verbose
dtype: bool
- name: widths
sequence:
sequence: int64
- name: depths
sequence:
sequence: int64
- name: lengths
sequence:
sequence: int64
- name: powers
sequence: int64
- name: velocities
sequence: float64
splits:
- name: m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1
num_bytes: 1862
num_examples: 1
- name: m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1
num_bytes: 18886
num_examples: 1
download_size: 26304
dataset_size: 20748
- config_name: simulations
features:
- name: args
struct:
- name: absorptivity
dtype: float64
- name: bound_x_start
dtype: float64
- name: bound_x_stop
dtype: float64
- name: bound_y_start
dtype: float64
- name: bound_y_stop
dtype: float64
- name: bound_z_start
dtype: float64
- name: bound_z_stop
dtype: int64
- name: cp
dtype: float64
- name: d_beam
dtype: float64
- name: k
dtype: float64
- name: material
dtype: string
- name: mesh_resolution
dtype: float64
- name: power_start
dtype: int64
- name: power_step
dtype: int64
- name: power_stop
dtype: int64
- name: rho
dtype: float64
- name: t_melt
dtype: float64
- name: velocity_start
dtype: float64
- name: velocity_step
dtype: float64
- name: velocity_stop
dtype: float64
- name: verbose
dtype: bool
- name: width
dtype: float64
- name: depth
dtype: float64
- name: length
dtype: float64
- name: time
dtype: float64
- name: theta
sequence:
sequence:
sequence: float64
- name: power
dtype: int64
- name: velocity
dtype: float64
splits:
- name: m_Ti64_p_0_v_0.0
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_0_v_0.1
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_0_v_0.2
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_0_v_0.3
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_0_v_0.4
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_0_v_0.5
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_0_v_0.6
num_bytes: 182566661
num_examples: 5
- name: m_Ti64_p_0_v_0.7
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_0_v_0.8
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_0_v_0.9
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_0_v_1.0
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_0_v_1.1
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_0_v_1.2
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_0_v_1.3
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_0_v_1.4
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_0_v_1.5
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_0_v_1.6
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_0_v_1.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_1.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_1.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.0
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.1
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.2
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.3
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.4
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.5
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.6
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_0_v_2.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_0.0
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_20_v_0.1
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_20_v_0.2
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_20_v_0.3
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_20_v_0.4
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_20_v_0.5
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_20_v_0.6
num_bytes: 182566661
num_examples: 5
- name: m_Ti64_p_20_v_0.7
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_20_v_0.8
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_20_v_0.9
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_20_v_1.0
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_20_v_1.1
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_20_v_1.2
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_20_v_1.3
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_20_v_1.4
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_20_v_1.5
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_20_v_1.6
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_20_v_1.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_1.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_1.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.0
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.1
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.2
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.3
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.4
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.5
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.6
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_20_v_2.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_0.0
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_40_v_0.1
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_40_v_0.2
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_40_v_0.3
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_40_v_0.4
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_40_v_0.5
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_40_v_0.6
num_bytes: 182566661
num_examples: 5
- name: m_Ti64_p_40_v_0.7
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_40_v_0.8
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_40_v_0.9
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_40_v_1.0
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_40_v_1.1
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_40_v_1.2
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_40_v_1.3
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_40_v_1.4
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_40_v_1.5
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_40_v_1.6
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_40_v_1.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_1.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_1.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.0
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.1
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.2
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.3
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.4
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.5
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.6
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_40_v_2.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_0.0
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_60_v_0.1
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_60_v_0.2
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_60_v_0.3
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_60_v_0.4
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_60_v_0.5
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_60_v_0.6
num_bytes: 182566661
num_examples: 5
- name: m_Ti64_p_60_v_0.7
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_60_v_0.8
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_60_v_0.9
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_60_v_1.0
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_60_v_1.1
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_60_v_1.2
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_60_v_1.3
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_60_v_1.4
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_60_v_1.5
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_60_v_1.6
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_60_v_1.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_1.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_1.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.0
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.1
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.2
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.3
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.4
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.5
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.6
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_60_v_2.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_80_v_0.0
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_80_v_0.1
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_80_v_0.2
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_80_v_0.3
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_80_v_0.4
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_80_v_0.5
num_bytes: 219079993
num_examples: 6
- name: m_Ti64_p_80_v_0.6
num_bytes: 182566661
num_examples: 5
- name: m_Ti64_p_80_v_0.7
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_80_v_0.8
num_bytes: 146053329
num_examples: 4
- name: m_Ti64_p_80_v_0.9
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_80_v_1.0
num_bytes: 109539997
num_examples: 3
- name: m_Ti64_p_80_v_1.1
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_80_v_1.2
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_80_v_1.3
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_80_v_1.4
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_80_v_1.5
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_80_v_1.6
num_bytes: 73026665
num_examples: 2
- name: m_Ti64_p_80_v_1.7
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_80_v_1.8
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_80_v_1.9
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_80_v_2.0
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_80_v_2.1
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_80_v_2.2
num_bytes: 36513333
num_examples: 1
- name: m_Ti64_p_80_v_2.3
num_bytes: 36513333
num_examples: 1
download_size: 4445609703
dataset_size: 14386252952
configs:
- config_name: baseline
data_files:
- split: m_Ti6Al4V_v_0.0_0.1_0.1
path: baseline/m_Ti6Al4V_v_0.0_0.1_0.1-*
- split: m_Ti6Al4V_v_0.0_2.9_0.1
path: baseline/m_Ti6Al4V_v_0.0_2.9_0.1-*
- config_name: default
data_files:
- split: train
path: data/train-*
- config_name: process_maps
data_files:
- split: m_Ti64_p_0_20_20_v_0.0_0.0_0.1
path: process_maps/m_Ti64_p_0_20_20_v_0.0_0.0_0.1-*
- split: m_Ti64_p_0_480_20_v_0.0_2.9_0.1
path: process_maps/m_Ti64_p_0_480_20_v_0.0_2.9_0.1-*
- split: m_Ti64_p_0_480_20_v_0.0_2.9_0.1_a_0.27
path: process_maps/m_Ti64_p_0_480_20_v_0.0_2.9_0.1_a_0.27-*
- config_name: process_maps_baseline
data_files:
- split: m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1
path: process_maps_baseline/m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1-*
- split: m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1
path: process_maps_baseline/m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1-*
- config_name: process_maps_baseline_test
data_files:
- split: m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1
path: process_maps_baseline_test/m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1-*
- split: m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1
path: process_maps_baseline_test/m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1-*
- config_name: process_maps_baseline_test_2
data_files:
- split: m_IN625_p_0_480_20_v_0.0_2.9_0.1
path: process_maps_baseline_test_2/m_IN625_p_0_480_20_v_0.0_2.9_0.1-*
- config_name: process_maps_baseline_test_3
data_files:
- split: m_IN625_p_0_480_20_v_0.0_2.9_0.1
path: process_maps_baseline_test_3/m_IN625_p_0_480_20_v_0.0_2.9_0.1-*
- config_name: process_maps_test
data_files:
- split: m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1
path: process_maps_test/m_Ti6Al4V_p_0_480_20_v_0.0_0.1_0.1-*
- split: m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1
path: process_maps_test/m_Ti6Al4V_p_0_480_20_v_0.0_2.9_0.1-*
- config_name: simulations
data_files:
- split: m_Ti64_p_0_v_0.0
path: simulations/m_Ti64_p_0_v_0.0-*
- split: m_Ti64_p_20_v_0.0
path: simulations/m_Ti64_p_20_v_0.0-*
- split: m_Ti64_p_0_v_0.1
path: simulations/m_Ti64_p_0_v_0.1-*
- split: m_Ti64_p_0_v_0.2
path: simulations/m_Ti64_p_0_v_0.2-*
- split: m_Ti64_p_0_v_0.3
path: simulations/m_Ti64_p_0_v_0.3-*
- split: m_Ti64_p_0_v_0.4
path: simulations/m_Ti64_p_0_v_0.4-*
- split: m_Ti64_p_0_v_0.5
path: simulations/m_Ti64_p_0_v_0.5-*
- split: m_Ti64_p_0_v_0.6
path: simulations/m_Ti64_p_0_v_0.6-*
- split: m_Ti64_p_0_v_0.7
path: simulations/m_Ti64_p_0_v_0.7-*
- split: m_Ti64_p_0_v_0.8
path: simulations/m_Ti64_p_0_v_0.8-*
- split: m_Ti64_p_0_v_0.9
path: simulations/m_Ti64_p_0_v_0.9-*
- split: m_Ti64_p_0_v_1.0
path: simulations/m_Ti64_p_0_v_1.0-*
- split: m_Ti64_p_0_v_1.1
path: simulations/m_Ti64_p_0_v_1.1-*
- split: m_Ti64_p_0_v_1.2
path: simulations/m_Ti64_p_0_v_1.2-*
- split: m_Ti64_p_0_v_1.3
path: simulations/m_Ti64_p_0_v_1.3-*
- split: m_Ti64_p_0_v_1.4
path: simulations/m_Ti64_p_0_v_1.4-*
- split: m_Ti64_p_0_v_1.5
path: simulations/m_Ti64_p_0_v_1.5-*
- split: m_Ti64_p_0_v_1.6
path: simulations/m_Ti64_p_0_v_1.6-*
- split: m_Ti64_p_0_v_1.7
path: simulations/m_Ti64_p_0_v_1.7-*
- split: m_Ti64_p_0_v_1.8
path: simulations/m_Ti64_p_0_v_1.8-*
- split: m_Ti64_p_0_v_1.9
path: simulations/m_Ti64_p_0_v_1.9-*
- split: m_Ti64_p_0_v_2.0
path: simulations/m_Ti64_p_0_v_2.0-*
- split: m_Ti64_p_0_v_2.1
path: simulations/m_Ti64_p_0_v_2.1-*
- split: m_Ti64_p_0_v_2.2
path: simulations/m_Ti64_p_0_v_2.2-*
- split: m_Ti64_p_0_v_2.3
path: simulations/m_Ti64_p_0_v_2.3-*
- split: m_Ti64_p_0_v_2.4
path: simulations/m_Ti64_p_0_v_2.4-*
- split: m_Ti64_p_0_v_2.5
path: simulations/m_Ti64_p_0_v_2.5-*
- split: m_Ti64_p_0_v_2.6
path: simulations/m_Ti64_p_0_v_2.6-*
- split: m_Ti64_p_0_v_2.7
path: simulations/m_Ti64_p_0_v_2.7-*
- split: m_Ti64_p_0_v_2.8
path: simulations/m_Ti64_p_0_v_2.8-*
- split: m_Ti64_p_0_v_2.9
path: simulations/m_Ti64_p_0_v_2.9-*
- split: m_Ti64_p_20_v_0.1
path: simulations/m_Ti64_p_20_v_0.1-*
- split: m_Ti64_p_20_v_0.2
path: simulations/m_Ti64_p_20_v_0.2-*
- split: m_Ti64_p_20_v_0.3
path: simulations/m_Ti64_p_20_v_0.3-*
- split: m_Ti64_p_20_v_0.4
path: simulations/m_Ti64_p_20_v_0.4-*
- split: m_Ti64_p_20_v_0.5
path: simulations/m_Ti64_p_20_v_0.5-*
- split: m_Ti64_p_20_v_0.6
path: simulations/m_Ti64_p_20_v_0.6-*
- split: m_Ti64_p_20_v_0.7
path: simulations/m_Ti64_p_20_v_0.7-*
- split: m_Ti64_p_20_v_0.8
path: simulations/m_Ti64_p_20_v_0.8-*
- split: m_Ti64_p_20_v_0.9
path: simulations/m_Ti64_p_20_v_0.9-*
- split: m_Ti64_p_20_v_1.0
path: simulations/m_Ti64_p_20_v_1.0-*
- split: m_Ti64_p_20_v_1.1
path: simulations/m_Ti64_p_20_v_1.1-*
- split: m_Ti64_p_20_v_1.2
path: simulations/m_Ti64_p_20_v_1.2-*
- split: m_Ti64_p_20_v_1.3
path: simulations/m_Ti64_p_20_v_1.3-*
- split: m_Ti64_p_20_v_1.4
path: simulations/m_Ti64_p_20_v_1.4-*
- split: m_Ti64_p_20_v_1.5
path: simulations/m_Ti64_p_20_v_1.5-*
- split: m_Ti64_p_20_v_1.6
path: simulations/m_Ti64_p_20_v_1.6-*
- split: m_Ti64_p_20_v_1.7
path: simulations/m_Ti64_p_20_v_1.7-*
- split: m_Ti64_p_20_v_1.8
path: simulations/m_Ti64_p_20_v_1.8-*
- split: m_Ti64_p_20_v_1.9
path: simulations/m_Ti64_p_20_v_1.9-*
- split: m_Ti64_p_20_v_2.0
path: simulations/m_Ti64_p_20_v_2.0-*
- split: m_Ti64_p_20_v_2.1
path: simulations/m_Ti64_p_20_v_2.1-*
- split: m_Ti64_p_20_v_2.2
path: simulations/m_Ti64_p_20_v_2.2-*
- split: m_Ti64_p_20_v_2.3
path: simulations/m_Ti64_p_20_v_2.3-*
- split: m_Ti64_p_20_v_2.4
path: simulations/m_Ti64_p_20_v_2.4-*
- split: m_Ti64_p_20_v_2.5
path: simulations/m_Ti64_p_20_v_2.5-*
- split: m_Ti64_p_20_v_2.6
path: simulations/m_Ti64_p_20_v_2.6-*
- split: m_Ti64_p_20_v_2.7
path: simulations/m_Ti64_p_20_v_2.7-*
- split: m_Ti64_p_20_v_2.8
path: simulations/m_Ti64_p_20_v_2.8-*
- split: m_Ti64_p_20_v_2.9
path: simulations/m_Ti64_p_20_v_2.9-*
- split: m_Ti64_p_40_v_0.0
path: simulations/m_Ti64_p_40_v_0.0-*
- split: m_Ti64_p_40_v_0.1
path: simulations/m_Ti64_p_40_v_0.1-*
- split: m_Ti64_p_40_v_0.2
path: simulations/m_Ti64_p_40_v_0.2-*
- split: m_Ti64_p_40_v_0.3
path: simulations/m_Ti64_p_40_v_0.3-*
- split: m_Ti64_p_40_v_0.4
path: simulations/m_Ti64_p_40_v_0.4-*
- split: m_Ti64_p_40_v_0.5
path: simulations/m_Ti64_p_40_v_0.5-*
- split: m_Ti64_p_40_v_0.6
path: simulations/m_Ti64_p_40_v_0.6-*
- split: m_Ti64_p_40_v_0.7
path: simulations/m_Ti64_p_40_v_0.7-*
- split: m_Ti64_p_40_v_0.8
path: simulations/m_Ti64_p_40_v_0.8-*
- split: m_Ti64_p_40_v_0.9
path: simulations/m_Ti64_p_40_v_0.9-*
- split: m_Ti64_p_40_v_1.0
path: simulations/m_Ti64_p_40_v_1.0-*
- split: m_Ti64_p_40_v_1.1
path: simulations/m_Ti64_p_40_v_1.1-*
- split: m_Ti64_p_40_v_1.2
path: simulations/m_Ti64_p_40_v_1.2-*
- split: m_Ti64_p_40_v_1.3
path: simulations/m_Ti64_p_40_v_1.3-*
- split: m_Ti64_p_40_v_1.4
path: simulations/m_Ti64_p_40_v_1.4-*
- split: m_Ti64_p_40_v_1.5
path: simulations/m_Ti64_p_40_v_1.5-*
- split: m_Ti64_p_40_v_1.6
path: simulations/m_Ti64_p_40_v_1.6-*
- split: m_Ti64_p_40_v_1.7
path: simulations/m_Ti64_p_40_v_1.7-*
- split: m_Ti64_p_40_v_1.8
path: simulations/m_Ti64_p_40_v_1.8-*
- split: m_Ti64_p_40_v_1.9
path: simulations/m_Ti64_p_40_v_1.9-*
- split: m_Ti64_p_40_v_2.0
path: simulations/m_Ti64_p_40_v_2.0-*
- split: m_Ti64_p_40_v_2.1
path: simulations/m_Ti64_p_40_v_2.1-*
- split: m_Ti64_p_40_v_2.2
path: simulations/m_Ti64_p_40_v_2.2-*
- split: m_Ti64_p_40_v_2.3
path: simulations/m_Ti64_p_40_v_2.3-*
- split: m_Ti64_p_40_v_2.4
path: simulations/m_Ti64_p_40_v_2.4-*
- split: m_Ti64_p_40_v_2.5
path: simulations/m_Ti64_p_40_v_2.5-*
- split: m_Ti64_p_40_v_2.6
path: simulations/m_Ti64_p_40_v_2.6-*
- split: m_Ti64_p_40_v_2.7
path: simulations/m_Ti64_p_40_v_2.7-*
- split: m_Ti64_p_40_v_2.8
path: simulations/m_Ti64_p_40_v_2.8-*
- split: m_Ti64_p_40_v_2.9
path: simulations/m_Ti64_p_40_v_2.9-*
- split: m_Ti64_p_60_v_0.0
path: simulations/m_Ti64_p_60_v_0.0-*
- split: m_Ti64_p_60_v_0.1
path: simulations/m_Ti64_p_60_v_0.1-*
- split: m_Ti64_p_60_v_0.2
path: simulations/m_Ti64_p_60_v_0.2-*
- split: m_Ti64_p_60_v_0.3
path: simulations/m_Ti64_p_60_v_0.3-*
- split: m_Ti64_p_60_v_0.4
path: simulations/m_Ti64_p_60_v_0.4-*
- split: m_Ti64_p_60_v_0.5
path: simulations/m_Ti64_p_60_v_0.5-*
- split: m_Ti64_p_60_v_0.6
path: simulations/m_Ti64_p_60_v_0.6-*
- split: m_Ti64_p_60_v_0.7
path: simulations/m_Ti64_p_60_v_0.7-*
- split: m_Ti64_p_60_v_0.8
path: simulations/m_Ti64_p_60_v_0.8-*
- split: m_Ti64_p_60_v_0.9
path: simulations/m_Ti64_p_60_v_0.9-*
- split: m_Ti64_p_60_v_1.0
path: simulations/m_Ti64_p_60_v_1.0-*
- split: m_Ti64_p_60_v_1.1
path: simulations/m_Ti64_p_60_v_1.1-*
- split: m_Ti64_p_60_v_1.2
path: simulations/m_Ti64_p_60_v_1.2-*
- split: m_Ti64_p_60_v_1.3
path: simulations/m_Ti64_p_60_v_1.3-*
- split: m_Ti64_p_60_v_1.4
path: simulations/m_Ti64_p_60_v_1.4-*
- split: m_Ti64_p_60_v_1.5
path: simulations/m_Ti64_p_60_v_1.5-*
- split: m_Ti64_p_60_v_1.6
path: simulations/m_Ti64_p_60_v_1.6-*
- split: m_Ti64_p_60_v_1.7
path: simulations/m_Ti64_p_60_v_1.7-*
- split: m_Ti64_p_60_v_1.8
path: simulations/m_Ti64_p_60_v_1.8-*
- split: m_Ti64_p_60_v_1.9
path: simulations/m_Ti64_p_60_v_1.9-*
- split: m_Ti64_p_60_v_2.0
path: simulations/m_Ti64_p_60_v_2.0-*
- split: m_Ti64_p_60_v_2.1
path: simulations/m_Ti64_p_60_v_2.1-*
- split: m_Ti64_p_60_v_2.2
path: simulations/m_Ti64_p_60_v_2.2-*
- split: m_Ti64_p_60_v_2.3
path: simulations/m_Ti64_p_60_v_2.3-*
- split: m_Ti64_p_60_v_2.4
path: simulations/m_Ti64_p_60_v_2.4-*
- split: m_Ti64_p_60_v_2.5
path: simulations/m_Ti64_p_60_v_2.5-*
- split: m_Ti64_p_60_v_2.6
path: simulations/m_Ti64_p_60_v_2.6-*
- split: m_Ti64_p_60_v_2.7
path: simulations/m_Ti64_p_60_v_2.7-*
- split: m_Ti64_p_60_v_2.8
path: simulations/m_Ti64_p_60_v_2.8-*
- split: m_Ti64_p_60_v_2.9
path: simulations/m_Ti64_p_60_v_2.9-*
- split: m_Ti64_p_80_v_0.0
path: simulations/m_Ti64_p_80_v_0.0-*
- split: m_Ti64_p_80_v_0.1
path: simulations/m_Ti64_p_80_v_0.1-*
- split: m_Ti64_p_80_v_0.2
path: simulations/m_Ti64_p_80_v_0.2-*
- split: m_Ti64_p_80_v_0.3
path: simulations/m_Ti64_p_80_v_0.3-*
- split: m_Ti64_p_80_v_0.4
path: simulations/m_Ti64_p_80_v_0.4-*
- split: m_Ti64_p_80_v_0.5
path: simulations/m_Ti64_p_80_v_0.5-*
- split: m_Ti64_p_80_v_0.6
path: simulations/m_Ti64_p_80_v_0.6-*
- split: m_Ti64_p_80_v_0.7
path: simulations/m_Ti64_p_80_v_0.7-*
- split: m_Ti64_p_80_v_0.8
path: simulations/m_Ti64_p_80_v_0.8-*
- split: m_Ti64_p_80_v_0.9
path: simulations/m_Ti64_p_80_v_0.9-*
- split: m_Ti64_p_80_v_1.0
path: simulations/m_Ti64_p_80_v_1.0-*
- split: m_Ti64_p_80_v_1.1
path: simulations/m_Ti64_p_80_v_1.1-*
- split: m_Ti64_p_80_v_1.2
path: simulations/m_Ti64_p_80_v_1.2-*
- split: m_Ti64_p_80_v_1.3
path: simulations/m_Ti64_p_80_v_1.3-*
- split: m_Ti64_p_80_v_1.4
path: simulations/m_Ti64_p_80_v_1.4-*
- split: m_Ti64_p_80_v_1.5
path: simulations/m_Ti64_p_80_v_1.5-*
- split: m_Ti64_p_80_v_1.6
path: simulations/m_Ti64_p_80_v_1.6-*
- split: m_Ti64_p_80_v_1.7
path: simulations/m_Ti64_p_80_v_1.7-*
- split: m_Ti64_p_80_v_1.8
path: simulations/m_Ti64_p_80_v_1.8-*
- split: m_Ti64_p_80_v_1.9
path: simulations/m_Ti64_p_80_v_1.9-*
- split: m_Ti64_p_80_v_2.0
path: simulations/m_Ti64_p_80_v_2.0-*
- split: m_Ti64_p_80_v_2.1
path: simulations/m_Ti64_p_80_v_2.1-*
- split: m_Ti64_p_80_v_2.2
path: simulations/m_Ti64_p_80_v_2.2-*
- split: m_Ti64_p_80_v_2.3
path: simulations/m_Ti64_p_80_v_2.3-*
---
|
Xmaster6y/Mixtral-8x400M-v0.1-activations | Xmaster6y | "2024-04-07T22:11:03Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"modality:timeseries",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T18:46:39Z" | ---
dataset_info:
features:
- name: activation
sequence: float32
- name: domain
dtype: string
- name: expert
dtype: int64
splits:
- name: model.layers.0
num_bytes: 2682644235
num_examples: 163509
- name: model.layers.1
num_bytes: 2686548772
num_examples: 163747
download_size: 2573049193
dataset_size: 5369193007
configs:
- config_name: default
data_files:
- split: model.layers.0
path: data/model.layers.0-*
- split: model.layers.1
path: data/model.layers.1-*
---
|
oza75/mt-fr-bm-texts | oza75 | "2024-04-23T12:26:17Z" | 0 | 1 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T18:54:30Z" | ---
dataset_info:
- config_name: dictionnary
features:
- name: bambara
dtype: string
- name: french
dtype: string
splits:
- name: train
num_bytes: 42871
num_examples: 636
download_size: 26609
dataset_size: 42871
- config_name: main
features:
- name: bambara
dtype: string
- name: french
dtype: string
splits:
- name: train
num_bytes: 8363882
num_examples: 77307
download_size: 4821696
dataset_size: 8363882
- config_name: synthetic
features:
- name: French
dtype: string
- name: Bambara
dtype: string
splits:
- name: train
num_bytes: 39287189
num_examples: 79376
download_size: 22597095
dataset_size: 39287189
- config_name: transcriptions
features:
- name: bambara
dtype: string
- name: french
dtype: string
splits:
- name: train
num_bytes: 377011
num_examples: 4129
download_size: 226726
dataset_size: 377011
configs:
- config_name: dictionnary
data_files:
- split: train
path: dictionnary/train-*
- config_name: main
data_files:
- split: train
path: main/train-*
- config_name: synthetic
data_files:
- split: train
path: synthetic/train-*
- config_name: transcriptions
data_files:
- split: train
path: transcriptions/train-*
---
|
AnonymousAuthorICAIF24/Instruction_Input_dataset_07_04 | AnonymousAuthorICAIF24 | "2024-04-07T18:57:12Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T18:57:09Z" | ---
dataset_info:
features:
- name: Instruction
dtype: string
- name: Input
dtype: string
splits:
- name: train
num_bytes: 297331
num_examples: 29
download_size: 141941
dataset_size: 297331
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arbml/ArSL21L | arbml | "2024-04-07T19:34:57Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:image",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T19:03:38Z" | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': ain
'1': al
'2': aleff
'3': bb
'4': dal
'5': dha
'6': dhad
'7': fa
'8': gaaf
'9': ghain
'10': ha
'11': haa
'12': jeem
'13': kaaf
'14': khaa
'15': la
'16': laam
'17': meem
'18': nun
'19': ra
'20': saad
'21': seen
'22': sheen
'23': ta
'24': taa
'25': thaa
'26': thal
'27': toot
'28': waw
'29': ya
'30': yaa
'31': zay
splits:
- name: train
num_bytes: 647055283.152
num_examples: 14202
download_size: 846084553
dataset_size: 647055283.152
---
# Dataset Card for ArSL21L
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [info]
- **Repository:** [info]
- **Paper:** [info]
- **Leaderboard:** [info]
- **Point of Contact:** [info]
### Dataset Summary
[More Information Needed]
### Supported Tasks and Leaderboards
[More Information Needed]
### Languages
[More Information Needed]
## Dataset Structure
### Data Instances
[More Information Needed]
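Although the sections of this card are still template placeholders, the `dataset_info` metadata above already pins down the structure: each instance pairs an `image` with a 32-class `label` (the Arabic sign-language letters listed above). A minimal sketch of inspecting one instance, assuming the standard `datasets` API and the repo id `arbml/ArSL21L` from this entry:

```python
# Minimal sketch: load one instance of ArSL21L and decode its label.
# Assumes the standard Hugging Face `datasets` API; the repo id and
# feature names are taken from the dataset_info metadata above.
from datasets import load_dataset

ds = load_dataset("arbml/ArSL21L", split="train")

example = ds[0]
image = example["image"]  # decoded as a PIL image
# Map the integer class id back to its name (e.g. "aleff", "ain", ...).
label_name = ds.features["label"].int2str(example["label"])
print(image.size, label_name)
```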
### Data Fields
[More Information Needed]
### Data Splits
[More Information Needed]
## Dataset Creation
### Curation Rationale
[More Information Needed]
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed]
#### Who are the source language producers?
[More Information Needed]
### Annotations
#### Annotation process
[More Information Needed]
#### Who are the annotators?
[More Information Needed]
### Personal and Sensitive Information
[More Information Needed]
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed]
### Discussion of Biases
[More Information Needed]
### Other Known Limitations
[More Information Needed]
## Additional Information
### Dataset Curators
[More Information Needed]
### Licensing Information
[More Information Needed]
### Citation Information
[More Information Needed]
### Contributions
Thanks to [@github-username](https://github.com/<github-username>) for adding this dataset. |
316usman/thematic4c_rr | 316usman | "2024-04-07T19:36:36Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T19:18:59Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: document_url
dtype: string
- name: source_url
dtype: string
- name: num_tokens
dtype: int64
splits:
- name: train
num_bytes: 101174683.2760672
num_examples: 158413
download_size: 35845623
dataset_size: 101174683.2760672
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nono647/2000_lieux_livre_eng | nono647 | "2024-04-07T19:19:06Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-07T19:19:06Z" | ---
license: apache-2.0
---
|
gsstein/0-baseline-dataset-llama | gsstein | "2024-04-09T01:58:43Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T19:41:47Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: url
dtype: string
- name: title
dtype: string
- name: summary
dtype: string
- name: text
dtype: string
- name: prompt
dtype: string
- name: generated
dtype: bool
- name: raw_summary
dtype: string
splits:
- name: train
num_bytes: 129465982
num_examples: 15326
- name: test
num_bytes: 4637530
num_examples: 576
- name: validation
num_bytes: 4922569
num_examples: 576
download_size: 85047396
dataset_size: 139026081
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
Az-r-ow/cifar100-custom | Az-r-ow | "2024-04-07T20:40:09Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T20:39:54Z" | ---
dataset_info:
features:
- name: img
struct:
- name: bytes
dtype: binary
- name: path
dtype: 'null'
- name: label
dtype:
class_label:
names:
'0': aquatic_animals
'1': household_furniture
'2': small_objects
'3': insects
'4': land_animals
'5': people
'6': outdoors
'7': vehicles
'8': food
splits:
- name: train
num_bytes: 130788367
num_examples: 57823
download_size: 137828019
dataset_size: 130788367
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
HPGomes/MJTalking | HPGomes | "2024-04-11T18:39:49Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-07T20:42:27Z" | ---
license: openrail
---
|
Ram07/text-data-v0.3 | Ram07 | "2024-04-07T20:55:58Z" | 0 | 0 | [
"license:mit",
"size_categories:10K<n<100K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T20:55:06Z" | ---
license: mit
---
|
open-llm-leaderboard-old/details_voidful__phi-1_5_base | open-llm-leaderboard-old | "2024-04-08T22:13:40Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-07T21:04:11Z" | ---
pretty_name: Evaluation run of voidful/phi-1_5_base
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [voidful/phi-1_5_base](https://huggingface.co/voidful/phi-1_5_base) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_voidful__phi-1_5_base\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-08T22:11:30.215297](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__phi-1_5_base/blob/main/results_2024-04-08T22-11-30.215297.json) (note
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.39679298771997645,\n\
\ \"acc_stderr\": 0.034186039948922095,\n \"acc_norm\": 0.39895719444857664,\n\
\ \"acc_norm_stderr\": 0.034950858129978286,\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.39710218968722627,\n\
\ \"mc2_stderr\": 0.01475674751424424\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.45819112627986347,\n \"acc_stderr\": 0.014560220308714695,\n\
\ \"acc_norm\": 0.5025597269624573,\n \"acc_norm_stderr\": 0.014611199329843777\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46325433180641307,\n\
\ \"acc_stderr\": 0.004976288321681822,\n \"acc_norm\": 0.6120294761999602,\n\
\ \"acc_norm_stderr\": 0.004862919176408073\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.042561937679014075,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.042561937679014075\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.04017901275981749,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.04017901275981749\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.46,\n\
\ \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n \
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.42641509433962266,\n \"acc_stderr\": 0.03043779434298305,\n\
\ \"acc_norm\": 0.42641509433962266,\n \"acc_norm_stderr\": 0.03043779434298305\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3194444444444444,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.3194444444444444,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.35260115606936415,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.35260115606936415,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.24509803921568626,\n \"acc_stderr\": 0.04280105837364395,\n\
\ \"acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.04280105837364395\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n\
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3446808510638298,\n \"acc_stderr\": 0.03106898596312215,\n\
\ \"acc_norm\": 0.3446808510638298,\n \"acc_norm_stderr\": 0.03106898596312215\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.30158730158730157,\n \"acc_stderr\": 0.023636975996101803,\n \"\
acc_norm\": 0.30158730158730157,\n \"acc_norm_stderr\": 0.023636975996101803\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3870967741935484,\n\
\ \"acc_stderr\": 0.027709359675032495,\n \"acc_norm\": 0.3870967741935484,\n\
\ \"acc_norm_stderr\": 0.027709359675032495\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.28078817733990147,\n \"acc_stderr\": 0.0316185633535861,\n\
\ \"acc_norm\": 0.28078817733990147,\n \"acc_norm_stderr\": 0.0316185633535861\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.03825460278380026,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.03825460278380026\n },\n\
\ \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\": 0.494949494949495,\n\
\ \"acc_stderr\": 0.035621707606254015,\n \"acc_norm\": 0.494949494949495,\n\
\ \"acc_norm_stderr\": 0.035621707606254015\n },\n \"harness|hendrycksTest-high_school_government_and_politics|5\"\
: {\n \"acc\": 0.48704663212435234,\n \"acc_stderr\": 0.0360722806104775,\n\
\ \"acc_norm\": 0.48704663212435234,\n \"acc_norm_stderr\": 0.0360722806104775\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.36153846153846153,\n \"acc_stderr\": 0.024359581465396987,\n\
\ \"acc_norm\": 0.36153846153846153,\n \"acc_norm_stderr\": 0.024359581465396987\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275805,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275805\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3739495798319328,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.3739495798319328,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2913907284768212,\n \"acc_stderr\": 0.03710185726119995,\n \"\
acc_norm\": 0.2913907284768212,\n \"acc_norm_stderr\": 0.03710185726119995\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.47706422018348627,\n \"acc_stderr\": 0.0214147570581755,\n \"\
acc_norm\": 0.47706422018348627,\n \"acc_norm_stderr\": 0.0214147570581755\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.19907407407407407,\n \"acc_stderr\": 0.027232298462690232,\n \"\
acc_norm\": 0.19907407407407407,\n \"acc_norm_stderr\": 0.027232298462690232\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.38235294117647056,\n \"acc_stderr\": 0.034107853389047184,\n \"\
acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.034107853389047184\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.4641350210970464,\n \"acc_stderr\": 0.03246338898055659,\n \
\ \"acc_norm\": 0.4641350210970464,\n \"acc_norm_stderr\": 0.03246338898055659\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.49327354260089684,\n\
\ \"acc_stderr\": 0.033554765962343545,\n \"acc_norm\": 0.49327354260089684,\n\
\ \"acc_norm_stderr\": 0.033554765962343545\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.04537935177947879,\n \"\
acc_norm\": 0.5537190082644629,\n \"acc_norm_stderr\": 0.04537935177947879\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.49074074074074076,\n\
\ \"acc_stderr\": 0.04832853553437055,\n \"acc_norm\": 0.49074074074074076,\n\
\ \"acc_norm_stderr\": 0.04832853553437055\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4294478527607362,\n \"acc_stderr\": 0.038890666191127216,\n\
\ \"acc_norm\": 0.4294478527607362,\n \"acc_norm_stderr\": 0.038890666191127216\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4368932038834951,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.4368932038834951,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5854700854700855,\n\
\ \"acc_stderr\": 0.03227396567623779,\n \"acc_norm\": 0.5854700854700855,\n\
\ \"acc_norm_stderr\": 0.03227396567623779\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.4827586206896552,\n\
\ \"acc_stderr\": 0.017869330154003698,\n \"acc_norm\": 0.4827586206896552,\n\
\ \"acc_norm_stderr\": 0.017869330154003698\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.49421965317919075,\n \"acc_stderr\": 0.02691729617914911,\n\
\ \"acc_norm\": 0.49421965317919075,\n \"acc_norm_stderr\": 0.02691729617914911\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261441,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261441\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.4477124183006536,\n \"acc_stderr\": 0.028472938478033526,\n\
\ \"acc_norm\": 0.4477124183006536,\n \"acc_norm_stderr\": 0.028472938478033526\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.4115755627009646,\n\
\ \"acc_stderr\": 0.027950481494401262,\n \"acc_norm\": 0.4115755627009646,\n\
\ \"acc_norm_stderr\": 0.027950481494401262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4074074074074074,\n \"acc_stderr\": 0.027339546640662727,\n\
\ \"acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.027339546640662727\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3191489361702128,\n \"acc_stderr\": 0.027807990141320203,\n \
\ \"acc_norm\": 0.3191489361702128,\n \"acc_norm_stderr\": 0.027807990141320203\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.30834419817470665,\n\
\ \"acc_stderr\": 0.01179483378971534,\n \"acc_norm\": 0.30834419817470665,\n\
\ \"acc_norm_stderr\": 0.01179483378971534\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22426470588235295,\n \"acc_stderr\": 0.02533684856333238,\n\
\ \"acc_norm\": 0.22426470588235295,\n \"acc_norm_stderr\": 0.02533684856333238\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3611111111111111,\n \"acc_stderr\": 0.01943177567703731,\n \
\ \"acc_norm\": 0.3611111111111111,\n \"acc_norm_stderr\": 0.01943177567703731\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04789131426105757,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04789131426105757\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.031976941187136725,\n\
\ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.031976941187136725\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5771144278606966,\n\
\ \"acc_stderr\": 0.034932317774212816,\n \"acc_norm\": 0.5771144278606966,\n\
\ \"acc_norm_stderr\": 0.034932317774212816\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.42105263157894735,\n \"acc_stderr\": 0.03786720706234215,\n\
\ \"acc_norm\": 0.42105263157894735,\n \"acc_norm_stderr\": 0.03786720706234215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.25703794369645044,\n\
\ \"mc1_stderr\": 0.01529807750948508,\n \"mc2\": 0.39710218968722627,\n\
\ \"mc2_stderr\": 0.01475674751424424\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6937647987371744,\n \"acc_stderr\": 0.01295438597280247\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.1652767247915087,\n \
\ \"acc_stderr\": 0.010231031118582147\n }\n}\n```"
repo_url: https://huggingface.co/voidful/phi-1_5_base
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|arc:challenge|25_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|gsm8k|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hellaswag|10_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T21-02-19.610688.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-11-30.215297.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-08T22-11-30.215297.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- '**/details_harness|winogrande|5_2024-04-07T21-02-19.610688.parquet'
- split: 2024_04_08T22_11_30.215297
path:
- '**/details_harness|winogrande|5_2024-04-08T22-11-30.215297.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-08T22-11-30.215297.parquet'
- config_name: results
data_files:
- split: 2024_04_07T21_02_19.610688
path:
- results_2024-04-07T21-02-19.610688.parquet
- split: 2024_04_08T22_11_30.215297
path:
- results_2024-04-08T22-11-30.215297.parquet
- split: latest
path:
- results_2024-04-08T22-11-30.215297.parquet
---
# Dataset Card for Evaluation run of voidful/phi-1_5_base
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [voidful/phi-1_5_base](https://huggingface.co/voidful/phi-1_5_base) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_voidful__phi-1_5_base",
"harness_winogrande_5",
split="train")
```
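To fetch only the aggregated metrics, or the details of a specific run, you can instead point `load_dataset` at the `results` configuration or at one of the timestamped splits declared in the YAML above. This is a minimal sketch using names taken from this card; any other configuration or split listed above can be substituted:
```python
from datasets import load_dataset

# Aggregated metrics; the "latest" split points to the most recent run
results = load_dataset(
    "open-llm-leaderboard/details_voidful__phi-1_5_base",
    "results",
    split="latest",
)

# Per-example details for one task from a specific run (timestamped split)
winogrande_run = load_dataset(
    "open-llm-leaderboard/details_voidful__phi-1_5_base",
    "harness_winogrande_5",
    split="2024_04_08T22_11_30.215297",
)
```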
## Latest results
These are the [latest results from run 2024-04-08T22:11:30.215297](https://huggingface.co/datasets/open-llm-leaderboard/details_voidful__phi-1_5_base/blob/main/results_2024-04-08T22-11-30.215297.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.39679298771997645,
"acc_stderr": 0.034186039948922095,
"acc_norm": 0.39895719444857664,
"acc_norm_stderr": 0.034950858129978286,
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.39710218968722627,
"mc2_stderr": 0.01475674751424424
},
"harness|arc:challenge|25": {
"acc": 0.45819112627986347,
"acc_stderr": 0.014560220308714695,
"acc_norm": 0.5025597269624573,
"acc_norm_stderr": 0.014611199329843777
},
"harness|hellaswag|10": {
"acc": 0.46325433180641307,
"acc_stderr": 0.004976288321681822,
"acc_norm": 0.6120294761999602,
"acc_norm_stderr": 0.004862919176408073
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.042561937679014075,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.042561937679014075
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.04017901275981749,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.04017901275981749
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.42641509433962266,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.42641509433962266,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.35260115606936415,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.35260115606936415,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.04280105837364395,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.04280105837364395
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3446808510638298,
"acc_stderr": 0.03106898596312215,
"acc_norm": 0.3446808510638298,
"acc_norm_stderr": 0.03106898596312215
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.30158730158730157,
"acc_stderr": 0.023636975996101803,
"acc_norm": 0.30158730158730157,
"acc_norm_stderr": 0.023636975996101803
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3870967741935484,
"acc_stderr": 0.027709359675032495,
"acc_norm": 0.3870967741935484,
"acc_norm_stderr": 0.027709359675032495
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.28078817733990147,
"acc_stderr": 0.0316185633535861,
"acc_norm": 0.28078817733990147,
"acc_norm_stderr": 0.0316185633535861
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4,
"acc_stderr": 0.03825460278380026,
"acc_norm": 0.4,
"acc_norm_stderr": 0.03825460278380026
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.494949494949495,
"acc_stderr": 0.035621707606254015,
"acc_norm": 0.494949494949495,
"acc_norm_stderr": 0.035621707606254015
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.48704663212435234,
"acc_stderr": 0.0360722806104775,
"acc_norm": 0.48704663212435234,
"acc_norm_stderr": 0.0360722806104775
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.36153846153846153,
"acc_stderr": 0.024359581465396987,
"acc_norm": 0.36153846153846153,
"acc_norm_stderr": 0.024359581465396987
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275805,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275805
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3739495798319328,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.3739495798319328,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2913907284768212,
"acc_stderr": 0.03710185726119995,
"acc_norm": 0.2913907284768212,
"acc_norm_stderr": 0.03710185726119995
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.47706422018348627,
"acc_stderr": 0.0214147570581755,
"acc_norm": 0.47706422018348627,
"acc_norm_stderr": 0.0214147570581755
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.19907407407407407,
"acc_stderr": 0.027232298462690232,
"acc_norm": 0.19907407407407407,
"acc_norm_stderr": 0.027232298462690232
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.034107853389047184,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.034107853389047184
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.4641350210970464,
"acc_stderr": 0.03246338898055659,
"acc_norm": 0.4641350210970464,
"acc_norm_stderr": 0.03246338898055659
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.49327354260089684,
"acc_stderr": 0.033554765962343545,
"acc_norm": 0.49327354260089684,
"acc_norm_stderr": 0.033554765962343545
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.04537935177947879,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.04537935177947879
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.04832853553437055,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.04832853553437055
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4294478527607362,
"acc_stderr": 0.038890666191127216,
"acc_norm": 0.4294478527607362,
"acc_norm_stderr": 0.038890666191127216
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.4368932038834951,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.4368932038834951,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5854700854700855,
"acc_stderr": 0.03227396567623779,
"acc_norm": 0.5854700854700855,
"acc_norm_stderr": 0.03227396567623779
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.017869330154003698,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.017869330154003698
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.49421965317919075,
"acc_stderr": 0.02691729617914911,
"acc_norm": 0.49421965317919075,
"acc_norm_stderr": 0.02691729617914911
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261441,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261441
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.4477124183006536,
"acc_stderr": 0.028472938478033526,
"acc_norm": 0.4477124183006536,
"acc_norm_stderr": 0.028472938478033526
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.4115755627009646,
"acc_stderr": 0.027950481494401262,
"acc_norm": 0.4115755627009646,
"acc_norm_stderr": 0.027950481494401262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.027339546640662727,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.027339546640662727
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3191489361702128,
"acc_stderr": 0.027807990141320203,
"acc_norm": 0.3191489361702128,
"acc_norm_stderr": 0.027807990141320203
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.30834419817470665,
"acc_stderr": 0.01179483378971534,
"acc_norm": 0.30834419817470665,
"acc_norm_stderr": 0.01179483378971534
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22426470588235295,
"acc_stderr": 0.02533684856333238,
"acc_norm": 0.22426470588235295,
"acc_norm_stderr": 0.02533684856333238
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3611111111111111,
"acc_stderr": 0.01943177567703731,
"acc_norm": 0.3611111111111111,
"acc_norm_stderr": 0.01943177567703731
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5,
"acc_stderr": 0.04789131426105757,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04789131426105757
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.031976941187136725,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.031976941187136725
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5771144278606966,
"acc_stderr": 0.034932317774212816,
"acc_norm": 0.5771144278606966,
"acc_norm_stderr": 0.034932317774212816
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.03786720706234215,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.03786720706234215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.25703794369645044,
"mc1_stderr": 0.01529807750948508,
"mc2": 0.39710218968722627,
"mc2_stderr": 0.01475674751424424
},
"harness|winogrande|5": {
"acc": 0.6937647987371744,
"acc_stderr": 0.01295438597280247
},
"harness|gsm8k|5": {
"acc": 0.1652767247915087,
"acc_stderr": 0.010231031118582147
}
}
```
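As a rough illustration of how the per-task numbers above can be post-processed, the sketch below recomputes a mean accuracy over the MMLU (`hendrycksTest`) entries. Only two subtasks are copied in to keep the snippet self-contained; the full results dict has one entry per subtask:
```python
# Two entries copied from the JSON above, as a self-contained illustration
latest_results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.35},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4148148148148148},
}

# Average accuracy over the "hendrycksTest" (MMLU) subtasks only
mmlu_tasks = [k for k in latest_results if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(latest_results[t]["acc"] for t in mmlu_tasks) / len(mmlu_tasks)
print(f"{len(mmlu_tasks)} MMLU subtasks included, mean acc = {mmlu_acc:.4f}")
```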
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ali-alkhars/job-search | ali-alkhars | "2024-04-07T21:10:51Z" | 0 | 0 | [
"language:en",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"jobs"
] | null | "2024-04-07T21:06:36Z" | ---
language:
- en
tags:
- jobs
pretty_name: Job Search Dataset
size_categories:
- n<1K
---
This dataset is used to train LMs to generate a very specific key, \[JOBSEARCH\], when the input asks for real-time job offers.
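For illustration only (the function names and routing logic below are hypothetical and not part of this dataset), a downstream application might scan the generated text for the key and branch to a live job search when it appears:
```python
JOBSEARCH_KEY = "[JOBSEARCH]"

def run_live_job_search() -> str:
    # Placeholder: a real application would call an external jobs API here
    return "Here are some current openings..."

def route_response(generated_text: str) -> str:
    """Branch to a real-time job search when the model emits the special key."""
    if JOBSEARCH_KEY in generated_text:
        return run_live_job_search()
    return generated_text

print(route_response("[JOBSEARCH]"))  # triggers the job-search branch
```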
### Dataset Sources
- ChatGPT-4
- Gemini |
MAdAiLab/lex_glue_scotus | MAdAiLab | "2024-04-07T21:16:44Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T21:16:34Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': '1'
'1': '2'
'2': '3'
'3': '4'
'4': '5'
'5': '6'
'6': '7'
'7': '8'
'8': '9'
'9': '10'
'10': '11'
'11': '12'
'12': '13'
splits:
- name: train
num_bytes: 178959316
num_examples: 5000
- name: test
num_bytes: 76213279
num_examples: 1400
- name: validation
num_bytes: 75600243
num_examples: 1400
download_size: 173411381
dataset_size: 330772838
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
MAdAiLab/lex_glue_ledgar | MAdAiLab | "2024-04-07T21:18:11Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T21:18:09Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Adjustments
'1': Agreements
'2': Amendments
'3': Anti-Corruption Laws
'4': Applicable Laws
'5': Approvals
'6': Arbitration
'7': Assignments
'8': Assigns
'9': Authority
'10': Authorizations
'11': Base Salary
'12': Benefits
'13': Binding Effects
'14': Books
'15': Brokers
'16': Capitalization
'17': Change In Control
'18': Closings
'19': Compliance With Laws
'20': Confidentiality
'21': Consent To Jurisdiction
'22': Consents
'23': Construction
'24': Cooperation
'25': Costs
'26': Counterparts
'27': Death
'28': Defined Terms
'29': Definitions
'30': Disability
'31': Disclosures
'32': Duties
'33': Effective Dates
'34': Effectiveness
'35': Employment
'36': Enforceability
'37': Enforcements
'38': Entire Agreements
'39': Erisa
'40': Existence
'41': Expenses
'42': Fees
'43': Financial Statements
'44': Forfeitures
'45': Further Assurances
'46': General
'47': Governing Laws
'48': Headings
'49': Indemnifications
'50': Indemnity
'51': Insurances
'52': Integration
'53': Intellectual Property
'54': Interests
'55': Interpretations
'56': Jurisdictions
'57': Liens
'58': Litigations
'59': Miscellaneous
'60': Modifications
'61': No Conflicts
'62': No Defaults
'63': No Waivers
'64': Non-Disparagement
'65': Notices
'66': Organizations
'67': Participations
'68': Payments
'69': Positions
'70': Powers
'71': Publicity
'72': Qualifications
'73': Records
'74': Releases
'75': Remedies
'76': Representations
'77': Sales
'78': Sanctions
'79': Severability
'80': Solvency
'81': Specific Performance
'82': Submission To Jurisdiction
'83': Subsidiaries
'84': Successors
'85': Survival
'86': Tax Withholdings
'87': Taxes
'88': Terminations
'89': Terms
'90': Titles
'91': Transactions With Affiliates
'92': Use Of Proceeds
'93': Vacations
'94': Venues
'95': Vesting
'96': Waiver Of Jury Trials
'97': Waivers
'98': Warranties
'99': Withholdings
splits:
- name: train
num_bytes: 43358291
num_examples: 60000
- name: test
num_bytes: 6845581
num_examples: 10000
- name: validation
num_bytes: 7143588
num_examples: 10000
download_size: 27650585
dataset_size: 57347460
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: validation
path: data/validation-*
---
|
MAdAiLab/patent_classification | MAdAiLab | "2024-04-07T21:18:49Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T21:18:47Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype:
class_label:
names:
'0': Human Necessities
'1': Performing Operations; Transporting
'2': Chemistry; Metallurgy
'3': Textiles; Paper
'4': Fixed Constructions
'5': Mechanical Engineering; Lightning; Heating; Weapons; Blasting
'6': Physics
'7': Electricity
'8': General tagging of new or cross-sectional technology
splits:
- name: train
num_bytes: 17225101
num_examples: 25000
- name: validation
num_bytes: 3472854
num_examples: 5000
- name: test
num_bytes: 3456733
num_examples: 5000
download_size: 12067953
dataset_size: 24154688
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
amaniabuzaid/Amani-ZU | amaniabuzaid | "2024-04-07T22:19:58Z" | 0 | 1 | [
"task_categories:feature-extraction",
"language:en",
"license:afl-3.0",
"size_categories:10K<n<100K",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"region:us",
"not-for-all-audiences"
] | [
"feature-extraction"
] | "2024-04-07T21:20:03Z" | ---
license: afl-3.0
task_categories:
- feature-extraction
language:
- en
tags:
- not-for-all-audiences
pretty_name: RealEstate
size_categories:
- n<1K
--- |
ali-alkhars/job-search-activation-eval | ali-alkhars | "2024-04-07T21:38:23Z" | 0 | 0 | [
"language:en",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"evaluation"
] | null | "2024-04-07T21:30:54Z" | ---
language:
- en
tags:
- evaluation
pretty_name: Job-search Activation Evaluation Dataset
size_categories:
- n<1K
---
This dataset is used to evaluate CareerBud on the job-search activation task. The entries are divided into:
- Activation prompts (prompts that should produce the signal \[JOBSEARCH\])
- Non-activation prompts (prompts that shouldn't produce the signal) |
ai-aerospace/ams_data_full_2000-2020 | ai-aerospace | "2024-04-07T21:52:43Z" | 0 | 0 | [
"task_categories:question-answering",
"task_categories:summarization",
"language:en",
"license:mit",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"question-answering",
"summarization"
] | "2024-04-07T21:41:14Z" | ---
license: mit
task_categories:
- question-answering
- summarization
language:
- en
---
Aerospace Mechanism Symposia PDF documents parsed by page. All symposia documents from 2000 to 2022 are included. No splitting was used.
Original documents here: https://github.com/dan-s-mueller/aerospace_chatbot/tree/main/data/AMS |
mxronga/sportsinyoruba | mxronga | "2024-04-07T22:38:11Z" | 0 | 0 | [
"language:yo",
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"pretrain "
] | null | "2024-04-07T22:03:26Z" | ---
license: apache-2.0
language:
- yo
tags:
- 'pretrain '
---
https://sportsinyoruba.wordpress.com |
HL121/stat453_dataset | HL121 | "2024-04-11T02:46:19Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T23:15:52Z" | ---
dataset_info:
features:
- name: source_img
dtype: image
- name: instruction
dtype: string
- name: target_img
dtype: image
splits:
- name: train
num_bytes: 678834941.8
num_examples: 2800
download_size: 695006048
dataset_size: 678834941.8
---
# Dataset Card for "stat453_dataset"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
not-lain/neirez | not-lain | "2024-04-08T00:25:41Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:image",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-07T23:27:02Z" | ---
dataset_info:
features:
- name: pk
dtype: int64
- name: username
dtype: string
- name: full_name
dtype: string
- name: is_private
dtype: bool
- name: profile_pic_url
dtype: string
- name: profile_pic_url_hd
dtype: string
- name: is_verified
dtype: bool
- name: media_count
dtype: int64
- name: follower_count
dtype: int64
- name: following_count
dtype: int64
- name: biography
dtype: string
- name: bio_links
dtype: string
- name: external_url
dtype: string
- name: account_type
dtype: int64
- name: is_business
dtype: bool
- name: public_email
dtype: string
- name: contact_phone_number
dtype: string
- name: public_phone_country_code
dtype: string
- name: public_phone_number
dtype: string
- name: business_contact_method
dtype: string
- name: business_category_name
dtype: string
- name: category_name
dtype: string
- name: category
dtype: string
- name: address_street
dtype: string
- name: city_id
dtype: string
- name: city_name
dtype: string
- name: latitude
dtype: string
- name: longitude
dtype: string
- name: zip
dtype: string
- name: instagram_location_id
dtype: string
- name: interop_messaging_user_fbid
dtype: string
- name: information
dtype: string
- name: embedding
sequence: float32
splits:
- name: train
num_bytes: 6497049
num_examples: 889
download_size: 5128161
dataset_size: 6497049
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
arsalanaa/oilpainting_training_768X768_15 | arsalanaa | "2024-04-07T23:39:38Z" | 0 | 0 | [
"license:unknown",
"size_categories:n<1K",
"format:imagefolder",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-07T23:38:59Z" | ---
license: unknown
---
|