---
pretty_name: Evaluation run of google/recurrentgemma-2b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [google/recurrentgemma-2b](https://huggingface.co/google/recurrentgemma-2b) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the
\ evaluated tasks.\n\nThe dataset has been created from 10 runs. Each run can be
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_google__recurrentgemma-2b\"\
,\n\t\"harness_hellaswag_10\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-26T23:47:43.035821](https://huggingface.co/datasets/open-llm-leaderboard/details_google__recurrentgemma-2b/blob/main/results_2024-04-26T23-47-43.035821.json) (note
\ that there might be results for other tasks in the repo if successive evals didn't
\ cover the same tasks. You can find each one in the \"results\" configuration and in the \"latest\" split for
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.534654451304521,\n\
\ \"acc_stderr\": 0.004977782217582457,\n \"acc_norm\": 0.7247560246962756,\n\
\ \"acc_norm_stderr\": 0.004457243336616505\n },\n \"harness|hellaswag|10\"\
: {\n \"acc\": 0.534654451304521,\n \"acc_stderr\": 0.004977782217582457,\n\
\ \"acc_norm\": 0.7247560246962756,\n \"acc_norm_stderr\": 0.004457243336616505\n\
\ }\n}\n```"
repo_url: https://huggingface.co/google/recurrentgemma-2b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|arc:challenge|25_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|arc:challenge|25_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|arc:challenge|25_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|arc:challenge|25_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|arc:challenge|25_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T14_23_18.466657
path:
- '**/details_harness|arc:challenge|25_2024-04-26T14-23-18.466657.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|arc:challenge|25_2024-04-26T21-09-47.796009.parquet'
- split: 2024_04_26T22_24_04.381286
path:
- '**/details_harness|arc:challenge|25_2024-04-26T22-24-04.381286.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-26T22-24-04.381286.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|gsm8k|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|gsm8k|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|gsm8k|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|gsm8k|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|gsm8k|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|gsm8k|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hellaswag|10_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hellaswag|10_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hellaswag|10_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hellaswag|10_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hellaswag|10_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T16_12_03.160717
path:
- '**/details_harness|hellaswag|10_2024-04-26T16-12-03.160717.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hellaswag|10_2024-04-26T21-09-47.796009.parquet'
- split: 2024_04_26T23_47_43.035821
path:
- '**/details_harness|hellaswag|10_2024-04-26T23-47-43.035821.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-26T23-47-43.035821.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-15-01.985047.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-53-48.304390.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T03-06-15.036120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-24T17-32-34.581100.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-25T15-46-23.348779.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-26T21-09-47.796009.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-26T21-09-47.796009.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- '**/details_harness|winogrande|5_2024-04-09T00-15-01.985047.parquet'
- split: 2024_04_09T00_53_48.304390
path:
- '**/details_harness|winogrande|5_2024-04-09T00-53-48.304390.parquet'
- split: 2024_04_09T03_06_15.036120
path:
- '**/details_harness|winogrande|5_2024-04-09T03-06-15.036120.parquet'
- split: 2024_04_24T17_32_34.581100
path:
- '**/details_harness|winogrande|5_2024-04-24T17-32-34.581100.parquet'
- split: 2024_04_25T15_46_23.348779
path:
- '**/details_harness|winogrande|5_2024-04-25T15-46-23.348779.parquet'
- split: 2024_04_26T21_09_47.796009
path:
- '**/details_harness|winogrande|5_2024-04-26T21-09-47.796009.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-26T21-09-47.796009.parquet'
- config_name: results
data_files:
- split: 2024_04_09T00_15_01.985047
path:
- results_2024-04-09T00-15-01.985047.parquet
- split: 2024_04_09T00_53_48.304390
path:
- results_2024-04-09T00-53-48.304390.parquet
- split: 2024_04_09T03_06_15.036120
path:
- results_2024-04-09T03-06-15.036120.parquet
- split: 2024_04_24T17_32_34.581100
path:
- results_2024-04-24T17-32-34.581100.parquet
- split: 2024_04_25T15_46_23.348779
path:
- results_2024-04-25T15-46-23.348779.parquet
- split: 2024_04_26T14_23_18.466657
path:
- results_2024-04-26T14-23-18.466657.parquet
- split: 2024_04_26T16_12_03.160717
path:
- results_2024-04-26T16-12-03.160717.parquet
- split: 2024_04_26T21_09_47.796009
path:
- results_2024-04-26T21-09-47.796009.parquet
- split: 2024_04_26T22_24_04.381286
path:
- results_2024-04-26T22-24-04.381286.parquet
- split: 2024_04_26T23_47_43.035821
path:
- results_2024-04-26T23-47-43.035821.parquet
- split: latest
path:
- results_2024-04-26T23-47-43.035821.parquet
---
# Dataset Card for Evaluation run of google/recurrentgemma-2b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [google/recurrentgemma-2b](https://huggingface.co/google/recurrentgemma-2b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 10 runs. Each run can be found as a specific split in each configuration, the split being named after the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_google__recurrentgemma-2b",
"harness_hellaswag_10",
split="train")
```
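Each configuration listed in the metadata above also exposes one split per timestamped run plus a `latest` split, and the aggregated `results` configuration follows the same pattern. The sketch below (using config and split names taken from this card's metadata) shows how you might load the most recent per-task details and the aggregated summary; the exact columns depend on the evaluation harness output.
```python
from datasets import load_dataset

# Per-task details: the "latest" split always points at the most recent run,
# while each timestamped split (e.g. "2024_04_26T21_09_47.796009") keeps an older run.
winogrande_details = load_dataset(
    "open-llm-leaderboard/details_google__recurrentgemma-2b",
    "harness_winogrande_5",
    split="latest",
)

# Aggregated metrics: the "results" configuration stores one summary row per run,
# again with a "latest" split for the newest one.
results = load_dataset(
    "open-llm-leaderboard/details_google__recurrentgemma-2b",
    "results",
    split="latest",
)
print(results[0])  # column structure depends on the harness version
```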
## Latest results
These are the [latest results from run 2024-04-26T23:47:43.035821](https://huggingface.co/datasets/open-llm-leaderboard/details_google__recurrentgemma-2b/blob/main/results_2024-04-26T23-47-43.035821.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.534654451304521,
"acc_stderr": 0.004977782217582457,
"acc_norm": 0.7247560246962756,
"acc_norm_stderr": 0.004457243336616505
},
"harness|hellaswag|10": {
"acc": 0.534654451304521,
"acc_stderr": 0.004977782217582457,
"acc_norm": 0.7247560246962756,
"acc_norm_stderr": 0.004457243336616505
}
}
```
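If you prefer the raw JSON file linked above over the parquet splits, it can be fetched directly from the dataset repository. A minimal sketch, assuming only that the file sits at the repo root as in the link above; its top-level structure varies with the harness version, so inspect the keys before digging in.
```python
import json
from huggingface_hub import hf_hub_download

# Download the raw results file for the run shown above from the dataset repo.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_google__recurrentgemma-2b",
    filename="results_2024-04-26T23-47-43.035821.json",
    repo_type="dataset",
)

with open(path) as f:
    run = json.load(f)

# Print the top-level keys first; the nested metric dict resembles the snippet above.
print(list(run.keys()))
```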
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.01362555690799347,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.01362555690799347\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43798882681564244,\n\
\ \"acc_stderr\": 0.016593394227564843,\n \"acc_norm\": 0.43798882681564244,\n\
\ \"acc_norm_stderr\": 0.016593394227564843\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n\
\ \"acc_stderr\": 0.012758410941038913,\n \"acc_norm\": 0.4784876140808344,\n\
\ \"acc_norm_stderr\": 0.012758410941038913\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.02812342933514278,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.02812342933514278\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6242350061199511,\n\
\ \"mc1_stderr\": 0.01695458406021429,\n \"mc2\": 0.7782437262946236,\n\
\ \"mc2_stderr\": 0.0137879523668123\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8539857932123125,\n \"acc_stderr\": 0.009924440374585244\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.012714401009923647\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/slerp-bophades-truthy-math-mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|arc:challenge|25_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|gsm8k|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hellaswag|10_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-19-46.142948.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T00-19-46.142948.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- '**/details_harness|winogrande|5_2024-04-09T00-19-46.142948.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T00-19-46.142948.parquet'
- config_name: results
data_files:
- split: 2024_04_09T00_19_46.142948
path:
- results_2024-04-09T00-19-46.142948.parquet
- split: latest
path:
- results_2024-04-09T00-19-46.142948.parquet
---
# Dataset Card for Evaluation run of nbeerbower/slerp-bophades-truthy-math-mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/slerp-bophades-truthy-math-mistral-7B](https://huggingface.co/nbeerbower/slerp-bophades-truthy-math-mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B",
"harness_winogrande_5",
split="train")
```
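The aggregated metrics can be loaded the same way from the "results" configuration listed above; a minimal sketch, assuming the "latest" split alias shown in the configs:
```python
from datasets import load_dataset

# "results" holds the aggregated metrics; the "latest" split always points to the most recent run.
results = load_dataset("open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B",
	"results",
	split="latest")
```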
## Latest results
These are the [latest results from run 2024-04-09T00:19:46.142948](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B/blob/main/results_2024-04-09T00-19-46.142948.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533577677120704,
"acc_stderr": 0.0321090974841392,
"acc_norm": 0.6524581392335448,
"acc_norm_stderr": 0.032786891825831214,
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7782437262946236,
"mc2_stderr": 0.0137879523668123
},
"harness|arc:challenge|25": {
"acc": 0.7141638225255973,
"acc_stderr": 0.013203196088537372,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7197769368651663,
"acc_stderr": 0.004481902637505652,
"acc_norm": 0.8916550487950607,
"acc_norm_stderr": 0.0031018035745563107
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.0253795249107784,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.0253795249107784
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159463,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159463
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.01362555690799347,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.01362555690799347
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43798882681564244,
"acc_stderr": 0.016593394227564843,
"acc_norm": 0.43798882681564244,
"acc_norm_stderr": 0.016593394227564843
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038913,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038913
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.02812342933514278,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.02812342933514278
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6242350061199511,
"mc1_stderr": 0.01695458406021429,
"mc2": 0.7782437262946236,
"mc2_stderr": 0.0137879523668123
},
"harness|winogrande|5": {
"acc": 0.8539857932123125,
"acc_stderr": 0.009924440374585244
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.012714401009923647
}
}
```
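Per-task details behind these scores follow the same loading pattern; a minimal sketch using one of the MMLU subtask configs listed above:
```python
from datasets import load_dataset

# Each MMLU subtask has its own config, e.g. world_religions (5-shot); "latest" points to this run.
details = load_dataset("open-llm-leaderboard/details_nbeerbower__slerp-bophades-truthy-math-mistral-7B",
	"harness_hendrycksTest_world_religions_5",
	split="latest")
```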
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gingercake01/stt0409 | gingercake01 | "2024-04-09T02:22:04Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-09T01:28:00Z" | ---
license: mit
dataset_info:
features:
- name: input_features
sequence:
sequence: float32
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 2251140096
num_examples: 2344
- name: test
num_bytes: 281392512
num_examples: 293
- name: valid
num_bytes: 281392512
num_examples: 293
download_size: 446723855
dataset_size: 2813925120
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
- split: valid
path: data/valid-*
---
|
nlp-brin-id/unsup-title-content | nlp-brin-id | "2024-04-09T02:13:21Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:10K<n<100K",
"format:csv",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T01:49:36Z" | ---
license: apache-2.0
---
|
hqfx/fc_sample | hqfx | "2024-04-09T07:18:02Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T02:31:37Z" | ---
dataset_info:
features:
- name: functions
dtype: string
- name: conversation
list:
- name: content
dtype: string
- name: function_call
struct:
- name: arguments
dtype: string
- name: name
dtype: string
- name: name
dtype: string
- name: role
dtype: string
splits:
- name: zh_easy_v1
num_bytes: 15168.989180972818
num_examples: 10
- name: zh_easy_v2
num_bytes: 55189.360492657506
num_examples: 10
- name: en_hard
num_bytes: 12585.883890024994
num_examples: 10
- name: en_react
num_bytes: 126288.2458364296
num_examples: 20
- name: zh_hard
num_bytes: 117715.8407079646
num_examples: 10
- name: zh_agent
num_bytes: 60719.32730923695
num_examples: 10
download_size: 209654
dataset_size: 387667.6474172865
configs:
- config_name: default
data_files:
- split: zh_easy_v1
path: data/zh_easy_v1-*
- split: zh_easy_v2
path: data/zh_easy_v2-*
- split: en_hard
path: data/en_hard-*
- split: en_react
path: data/en_react-*
- split: zh_hard
path: data/zh_hard-*
- split: zh_agent
path: data/zh_agent-*
---
|
KagglingFace/FYP-KiTS-A-Trimmed-Preprocess-Colab | KagglingFace | "2024-04-09T02:40:20Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-09T02:40:19Z" | ---
license: mit
---
|
Ediudo/colmanetti | Ediudo | "2024-04-09T03:30:07Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T03:29:49Z" | ---
license: openrail
---
|
iamnguyen/law_qa | iamnguyen | "2024-04-09T03:45:31Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T03:45:13Z" | ---
dataset_info:
features:
- name: title
dtype: string
- name: question
dtype: string
- name: content
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1457128742
num_examples: 206254
download_size: 432874652
dataset_size: 1457128742
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ahmedgongi/version_09_04 | ahmedgongi | "2024-04-09T03:57:09Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-09T03:57:09Z" | ---
license: apache-2.0
---
|
iamnguyen/law_qa_chat | iamnguyen | "2024-04-09T04:02:23Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T03:57:28Z" | ---
dataset_info:
features:
- name: title
dtype: string
- name: question
dtype: string
- name: content
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 1311411628.962231
num_examples: 185628
- name: test
num_bytes: 145717113.03776896
num_examples: 20626
download_size: 520762075
dataset_size: 1457128742.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
DopeorNope/slim_orca | DopeorNope | "2024-04-09T04:27:51Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T04:26:24Z" | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
- name: num
dtype: int64
splits:
- name: train
num_bytes: 900655397
num_examples: 517982
download_size: 485203597
dataset_size: 900655397
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
fizzfzzf/space | fizzfzzf | "2024-04-09T04:33:39Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-09T04:31:24Z" | ---
license: mit
---
|
dutta18/omcs_commonsense_corpus1.5M_for_fast_NN_search | dutta18 | "2024-04-09T04:40:25Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T04:35:38Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: embeddings
sequence: float64
splits:
- name: train
num_bytes: 4936612764
num_examples: 1578238
download_size: 4250048639
dataset_size: 4936612764
---
# Dataset Card for "omcs_commonsense_corpus1.5M_for_fast_NN_search"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
blablablanco/fakebb_evaluation | blablablanco | "2024-04-09T05:38:45Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:image",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T05:36:48Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Enhanced
'1': Natural
splits:
- name: train
num_bytes: 233969147.588
num_examples: 24484
- name: validation
num_bytes: 13296431.16
num_examples: 1360
- name: test
num_bytes: 13048059.197
num_examples: 1361
download_size: 268420934
dataset_size: 260313637.945
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Piyush2512/CREMA-mel-spectrogram-images-preprocessed | Piyush2512 | "2024-04-11T07:55:19Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:image",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T05:45:02Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Anger
'1': Happy
'2': Fear
'3': Sad
'4': Disgust
'5': Neutral
- name: pixel_values
sequence:
sequence:
sequence: float32
splits:
- name: train
num_bytes: 4875082092.75
num_examples: 7442
download_size: 993636094
dataset_size: 4875082092.75
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alkav/customerfeedbacks | alkav | "2024-04-09T05:45:55Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T05:45:38Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 110026
num_examples: 100
download_size: 20789
dataset_size: 110026
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ppoliver/deat | ppoliver | "2024-04-10T02:59:44Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-09T06:11:01Z" | ---
license: mit
---
|
man4j/ada_v3 | man4j | "2024-04-09T06:27:14Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T06:23:30Z" | ---
dataset_info:
features:
- name: instruct
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 235764.0
num_examples: 169
download_size: 41722
dataset_size: 235764.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
gagan3012/oasis | gagan3012 | "2024-04-09T06:34:17Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"doi:10.57967/hf/2029",
"region:us"
] | null | "2024-04-09T06:26:29Z" | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 6962220398
num_examples: 2344376
download_size: 3410521074
dataset_size: 6962220398
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nlp-brin-id/unsup-title-fact | nlp-brin-id | "2024-04-09T11:12:01Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:10K<n<100K",
"format:csv",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T06:37:50Z" | ---
license: apache-2.0
---
|
open-llm-leaderboard-old/details_lemon-mint__gemma-ko-7b-instruct-v0.71 | open-llm-leaderboard-old | "2024-04-09T06:40:00Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T06:39:37Z" | ---
pretty_name: Evaluation run of lemon-mint/gemma-ko-7b-instruct-v0.71
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lemon-mint/gemma-ko-7b-instruct-v0.71](https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.71)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T06:37:34.656936](https://huggingface.co/datasets/open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71/blob/main/results_2024-04-09T06-37-34.656936.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5942899564374612,\n\
\ \"acc_stderr\": 0.03331686070230768,\n \"acc_norm\": 0.5993757855966777,\n\
\ \"acc_norm_stderr\": 0.03398353543840741,\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.5172148516529776,\n\
\ \"mc2_stderr\": 0.015636005438812176\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5093856655290102,\n \"acc_stderr\": 0.014608816322065,\n\
\ \"acc_norm\": 0.5503412969283277,\n \"acc_norm_stderr\": 0.01453714444428473\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5894244174467238,\n\
\ \"acc_stderr\": 0.004909328992915072,\n \"acc_norm\": 0.7746464847639912,\n\
\ \"acc_norm_stderr\": 0.004169610254807967\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.660377358490566,\n \"acc_stderr\": 0.029146904747798335,\n\
\ \"acc_norm\": 0.660377358490566,\n \"acc_norm_stderr\": 0.029146904747798335\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.037161774375660164,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.037161774375660164\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145634,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145634\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5317919075144508,\n\
\ \"acc_stderr\": 0.03804749744364764,\n \"acc_norm\": 0.5317919075144508,\n\
\ \"acc_norm_stderr\": 0.03804749744364764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.046774730044911984,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.046774730044911984\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.025525034382474894,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.025525034382474894\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.04444444444444449,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.04444444444444449\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.025649381063029268,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.025649381063029268\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593552,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593552\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5923076923076923,\n \"acc_stderr\": 0.024915243985987847,\n\
\ \"acc_norm\": 0.5923076923076923,\n \"acc_norm_stderr\": 0.024915243985987847\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.29259259259259257,\n \"acc_stderr\": 0.027738969632176088,\n \
\ \"acc_norm\": 0.29259259259259257,\n \"acc_norm_stderr\": 0.027738969632176088\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.03077805742293167,\n \
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.03077805742293167\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8036697247706422,\n \"acc_stderr\": 0.01703071933915435,\n \"\
acc_norm\": 0.8036697247706422,\n \"acc_norm_stderr\": 0.01703071933915435\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7107843137254902,\n \"acc_stderr\": 0.031822318676475524,\n \"\
acc_norm\": 0.7107843137254902,\n \"acc_norm_stderr\": 0.031822318676475524\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.03749492448709697,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.03749492448709697\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6809815950920245,\n \"acc_stderr\": 0.03661997551073836,\n\
\ \"acc_norm\": 0.6809815950920245,\n \"acc_norm_stderr\": 0.03661997551073836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5089285714285714,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.5089285714285714,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.0398913985953177,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.0398913985953177\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560403,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7713920817369093,\n\
\ \"acc_stderr\": 0.015016884698539892,\n \"acc_norm\": 0.7713920817369093,\n\
\ \"acc_norm_stderr\": 0.015016884698539892\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6358381502890174,\n \"acc_stderr\": 0.025906632631016127,\n\
\ \"acc_norm\": 0.6358381502890174,\n \"acc_norm_stderr\": 0.025906632631016127\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369918,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369918\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6209150326797386,\n \"acc_stderr\": 0.02778014120702335,\n\
\ \"acc_norm\": 0.6209150326797386,\n \"acc_norm_stderr\": 0.02778014120702335\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630998,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.026229649178821157,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.026229649178821157\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4556714471968709,\n\
\ \"acc_stderr\": 0.012719949543032205,\n \"acc_norm\": 0.4556714471968709,\n\
\ \"acc_norm_stderr\": 0.012719949543032205\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5551470588235294,\n \"acc_stderr\": 0.030187532060329383,\n\
\ \"acc_norm\": 0.5551470588235294,\n \"acc_norm_stderr\": 0.030187532060329383\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5947712418300654,\n \"acc_stderr\": 0.019861155193829156,\n \
\ \"acc_norm\": 0.5947712418300654,\n \"acc_norm_stderr\": 0.019861155193829156\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910507,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910507\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065684,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065684\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017193,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017193\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.463855421686747,\n\
\ \"acc_stderr\": 0.03882310850890594,\n \"acc_norm\": 0.463855421686747,\n\
\ \"acc_norm_stderr\": 0.03882310850890594\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.03094445977853321,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.03094445977853321\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.33414932680538556,\n\
\ \"mc1_stderr\": 0.016512530677150538,\n \"mc2\": 0.5172148516529776,\n\
\ \"mc2_stderr\": 0.015636005438812176\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6977111286503551,\n \"acc_stderr\": 0.012907200361627532\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.41698256254738436,\n \
\ \"acc_stderr\": 0.013581320997216591\n }\n}\n```"
repo_url: https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.71
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-37-34.656936.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-37-34.656936.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- '**/details_harness|winogrande|5_2024-04-09T06-37-34.656936.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T06-37-34.656936.parquet'
- config_name: results
data_files:
- split: 2024_04_09T06_37_34.656936
path:
- results_2024-04-09T06-37-34.656936.parquet
- split: latest
path:
- results_2024-04-09T06-37-34.656936.parquet
---
# Dataset Card for Evaluation run of lemon-mint/gemma-ko-7b-instruct-v0.71
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lemon-mint/gemma-ko-7b-instruct-v0.71](https://huggingface.co/lemon-mint/gemma-ko-7b-instruct-v0.71) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71",
"harness_winogrande_5",
split="train")
```
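Each configuration also exposes a `latest` split pointing at the most recent run (see the `configs` section above). As a minimal sketch, assuming only the config and split names listed in this card, the aggregated `results` configuration can be loaded the same way and inspected as a pandas dataframe:
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model
results = load_dataset(
    "open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71",
    "results",
    split="latest",
)

# Each row holds the serialized scores for one run; convert for easier inspection
print(results.to_pandas().head())
```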
## Latest results
These are the [latest results from run 2024-04-09T06:37:34.656936](https://huggingface.co/datasets/open-llm-leaderboard/details_lemon-mint__gemma-ko-7b-instruct-v0.71/blob/main/results_2024-04-09T06-37-34.656936.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5942899564374612,
"acc_stderr": 0.03331686070230768,
"acc_norm": 0.5993757855966777,
"acc_norm_stderr": 0.03398353543840741,
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.5172148516529776,
"mc2_stderr": 0.015636005438812176
},
"harness|arc:challenge|25": {
"acc": 0.5093856655290102,
"acc_stderr": 0.014608816322065,
"acc_norm": 0.5503412969283277,
"acc_norm_stderr": 0.01453714444428473
},
"harness|hellaswag|10": {
"acc": 0.5894244174467238,
"acc_stderr": 0.004909328992915072,
"acc_norm": 0.7746464847639912,
"acc_norm_stderr": 0.004169610254807967
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.660377358490566,
"acc_stderr": 0.029146904747798335,
"acc_norm": 0.660377358490566,
"acc_norm_stderr": 0.029146904747798335
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.037161774375660164,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.037161774375660164
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145634,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145634
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5317919075144508,
"acc_stderr": 0.03804749744364764,
"acc_norm": 0.5317919075144508,
"acc_norm_stderr": 0.03804749744364764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.046774730044911984,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.046774730044911984
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.025525034382474894,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.025525034382474894
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.04444444444444449,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.04444444444444449
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.025649381063029268,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.025649381063029268
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593552,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593552
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5923076923076923,
"acc_stderr": 0.024915243985987847,
"acc_norm": 0.5923076923076923,
"acc_norm_stderr": 0.024915243985987847
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.29259259259259257,
"acc_stderr": 0.027738969632176088,
"acc_norm": 0.29259259259259257,
"acc_norm_stderr": 0.027738969632176088
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.03077805742293167,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.03077805742293167
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915435,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915435
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7107843137254902,
"acc_stderr": 0.031822318676475524,
"acc_norm": 0.7107843137254902,
"acc_norm_stderr": 0.031822318676475524
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.03749492448709697,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.03749492448709697
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6809815950920245,
"acc_stderr": 0.03661997551073836,
"acc_norm": 0.6809815950920245,
"acc_norm_stderr": 0.03661997551073836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5089285714285714,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.5089285714285714,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.0398913985953177,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.0398913985953177
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560403,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7713920817369093,
"acc_stderr": 0.015016884698539892,
"acc_norm": 0.7713920817369093,
"acc_norm_stderr": 0.015016884698539892
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6358381502890174,
"acc_stderr": 0.025906632631016127,
"acc_norm": 0.6358381502890174,
"acc_norm_stderr": 0.025906632631016127
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369918,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369918
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6209150326797386,
"acc_stderr": 0.02778014120702335,
"acc_norm": 0.6209150326797386,
"acc_norm_stderr": 0.02778014120702335
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630998,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.026229649178821157,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.026229649178821157
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4556714471968709,
"acc_stderr": 0.012719949543032205,
"acc_norm": 0.4556714471968709,
"acc_norm_stderr": 0.012719949543032205
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5551470588235294,
"acc_stderr": 0.030187532060329383,
"acc_norm": 0.5551470588235294,
"acc_norm_stderr": 0.030187532060329383
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5947712418300654,
"acc_stderr": 0.019861155193829156,
"acc_norm": 0.5947712418300654,
"acc_norm_stderr": 0.019861155193829156
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910507,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910507
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065684,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065684
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017193,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017193
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-virology|5": {
"acc": 0.463855421686747,
"acc_stderr": 0.03882310850890594,
"acc_norm": 0.463855421686747,
"acc_norm_stderr": 0.03882310850890594
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.03094445977853321,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.03094445977853321
},
"harness|truthfulqa:mc|0": {
"mc1": 0.33414932680538556,
"mc1_stderr": 0.016512530677150538,
"mc2": 0.5172148516529776,
"mc2_stderr": 0.015636005438812176
},
"harness|winogrande|5": {
"acc": 0.6977111286503551,
"acc_stderr": 0.012907200361627532
},
"harness|gsm8k|5": {
"acc": 0.41698256254738436,
"acc_stderr": 0.013581320997216591
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_elinas__chronos-mistral-7b | open-llm-leaderboard-old | "2024-04-09T06:42:31Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T06:42:03Z" | ---
pretty_name: Evaluation run of elinas/chronos-mistral-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [elinas/chronos-mistral-7b](https://huggingface.co/elinas/chronos-mistral-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_elinas__chronos-mistral-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T06:39:41.464301](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-mistral-7b/blob/main/results_2024-04-09T06-39-41.464301.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4943261332869945,\n\
\ \"acc_stderr\": 0.03440242385841512,\n \"acc_norm\": 0.4990977698278415,\n\
\ \"acc_norm_stderr\": 0.035152578733964476,\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.01628920337440338,\n \"mc2\": 0.48059222372011373,\n\
\ \"mc2_stderr\": 0.014984088747615087\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5273037542662116,\n \"acc_stderr\": 0.014589589101985989,\n\
\ \"acc_norm\": 0.5580204778156996,\n \"acc_norm_stderr\": 0.014512682523128342\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5751842262497511,\n\
\ \"acc_stderr\": 0.004933047726996793,\n \"acc_norm\": 0.7719577773351922,\n\
\ \"acc_norm_stderr\": 0.004187124964848515\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4740740740740741,\n\
\ \"acc_stderr\": 0.04313531696750574,\n \"acc_norm\": 0.4740740740740741,\n\
\ \"acc_norm_stderr\": 0.04313531696750574\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.4473684210526316,\n \"acc_stderr\": 0.04046336883978251,\n\
\ \"acc_norm\": 0.4473684210526316,\n \"acc_norm_stderr\": 0.04046336883978251\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.47,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.47,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5018867924528302,\n \"acc_stderr\": 0.030772653642075657,\n\
\ \"acc_norm\": 0.5018867924528302,\n \"acc_norm_stderr\": 0.030772653642075657\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5277777777777778,\n\
\ \"acc_stderr\": 0.04174752578923185,\n \"acc_norm\": 0.5277777777777778,\n\
\ \"acc_norm_stderr\": 0.04174752578923185\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.4508670520231214,\n\
\ \"acc_stderr\": 0.03794012674697029,\n \"acc_norm\": 0.4508670520231214,\n\
\ \"acc_norm_stderr\": 0.03794012674697029\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\": 0.66,\n\
\ \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.41702127659574467,\n \"acc_stderr\": 0.032232762667117124,\n\
\ \"acc_norm\": 0.41702127659574467,\n \"acc_norm_stderr\": 0.032232762667117124\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31746031746031744,\n \"acc_stderr\": 0.023973861998992062,\n \"\
acc_norm\": 0.31746031746031744,\n \"acc_norm_stderr\": 0.023973861998992062\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.0404061017820884,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.0404061017820884\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5935483870967742,\n\
\ \"acc_stderr\": 0.02794172734625631,\n \"acc_norm\": 0.5935483870967742,\n\
\ \"acc_norm_stderr\": 0.02794172734625631\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.35467980295566504,\n \"acc_stderr\": 0.0336612448905145,\n\
\ \"acc_norm\": 0.35467980295566504,\n \"acc_norm_stderr\": 0.0336612448905145\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6242424242424243,\n \"acc_stderr\": 0.03781887353205982,\n\
\ \"acc_norm\": 0.6242424242424243,\n \"acc_norm_stderr\": 0.03781887353205982\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.5757575757575758,\n \"acc_stderr\": 0.03521224908841586,\n \"\
acc_norm\": 0.5757575757575758,\n \"acc_norm_stderr\": 0.03521224908841586\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6994818652849741,\n \"acc_stderr\": 0.0330881859441575,\n\
\ \"acc_norm\": 0.6994818652849741,\n \"acc_norm_stderr\": 0.0330881859441575\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.44871794871794873,\n \"acc_stderr\": 0.025217315184846486,\n\
\ \"acc_norm\": 0.44871794871794873,\n \"acc_norm_stderr\": 0.025217315184846486\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911498,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911498\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.453781512605042,\n \"acc_stderr\": 0.03233943468182088,\n \
\ \"acc_norm\": 0.453781512605042,\n \"acc_norm_stderr\": 0.03233943468182088\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526731,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526731\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.6458715596330276,\n \"acc_stderr\": 0.020504729013829125,\n \"\
acc_norm\": 0.6458715596330276,\n \"acc_norm_stderr\": 0.020504729013829125\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6519607843137255,\n \"acc_stderr\": 0.03343311240488418,\n \"\
acc_norm\": 0.6519607843137255,\n \"acc_norm_stderr\": 0.03343311240488418\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5991561181434599,\n \"acc_stderr\": 0.03190080389473235,\n \
\ \"acc_norm\": 0.5991561181434599,\n \"acc_norm_stderr\": 0.03190080389473235\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6053811659192825,\n\
\ \"acc_stderr\": 0.03280400504755291,\n \"acc_norm\": 0.6053811659192825,\n\
\ \"acc_norm_stderr\": 0.03280400504755291\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.48854961832061067,\n \"acc_stderr\": 0.043841400240780176,\n\
\ \"acc_norm\": 0.48854961832061067,\n \"acc_norm_stderr\": 0.043841400240780176\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6859504132231405,\n \"acc_stderr\": 0.04236964753041018,\n \"\
acc_norm\": 0.6859504132231405,\n \"acc_norm_stderr\": 0.04236964753041018\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5462962962962963,\n\
\ \"acc_stderr\": 0.04812917324536824,\n \"acc_norm\": 0.5462962962962963,\n\
\ \"acc_norm_stderr\": 0.04812917324536824\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.5766871165644172,\n \"acc_stderr\": 0.03881891213334384,\n\
\ \"acc_norm\": 0.5766871165644172,\n \"acc_norm_stderr\": 0.03881891213334384\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7991452991452992,\n\
\ \"acc_stderr\": 0.02624677294689048,\n \"acc_norm\": 0.7991452991452992,\n\
\ \"acc_norm_stderr\": 0.02624677294689048\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.05021167315686779\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6768837803320562,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.6768837803320562,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4624277456647399,\n \"acc_stderr\": 0.026842985519615375,\n\
\ \"acc_norm\": 0.4624277456647399,\n \"acc_norm_stderr\": 0.026842985519615375\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.37988826815642457,\n\
\ \"acc_stderr\": 0.016232826818678492,\n \"acc_norm\": 0.37988826815642457,\n\
\ \"acc_norm_stderr\": 0.016232826818678492\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5098039215686274,\n \"acc_stderr\": 0.028624412550167958,\n\
\ \"acc_norm\": 0.5098039215686274,\n \"acc_norm_stderr\": 0.028624412550167958\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5466237942122186,\n\
\ \"acc_stderr\": 0.02827435985489424,\n \"acc_norm\": 0.5466237942122186,\n\
\ \"acc_norm_stderr\": 0.02827435985489424\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5462962962962963,\n \"acc_stderr\": 0.0277012284685426,\n\
\ \"acc_norm\": 0.5462962962962963,\n \"acc_norm_stderr\": 0.0277012284685426\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3404255319148936,\n \"acc_stderr\": 0.028267657482650144,\n \
\ \"acc_norm\": 0.3404255319148936,\n \"acc_norm_stderr\": 0.028267657482650144\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.363754889178618,\n\
\ \"acc_stderr\": 0.012286991879902887,\n \"acc_norm\": 0.363754889178618,\n\
\ \"acc_norm_stderr\": 0.012286991879902887\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4852941176470588,\n \"acc_stderr\": 0.03035969707904612,\n\
\ \"acc_norm\": 0.4852941176470588,\n \"acc_norm_stderr\": 0.03035969707904612\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4738562091503268,\n \"acc_stderr\": 0.020200164564804588,\n \
\ \"acc_norm\": 0.4738562091503268,\n \"acc_norm_stderr\": 0.020200164564804588\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.0469237132203465,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.0469237132203465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5224489795918368,\n \"acc_stderr\": 0.03197694118713672,\n\
\ \"acc_norm\": 0.5224489795918368,\n \"acc_norm_stderr\": 0.03197694118713672\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5920398009950248,\n\
\ \"acc_stderr\": 0.03475116365194092,\n \"acc_norm\": 0.5920398009950248,\n\
\ \"acc_norm_stderr\": 0.03475116365194092\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.45180722891566266,\n\
\ \"acc_stderr\": 0.038743715565879536,\n \"acc_norm\": 0.45180722891566266,\n\
\ \"acc_norm_stderr\": 0.038743715565879536\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6900584795321637,\n \"acc_stderr\": 0.035469769593931624,\n\
\ \"acc_norm\": 0.6900584795321637,\n \"acc_norm_stderr\": 0.035469769593931624\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.31701346389228885,\n\
\ \"mc1_stderr\": 0.01628920337440338,\n \"mc2\": 0.48059222372011373,\n\
\ \"mc2_stderr\": 0.014984088747615087\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7261247040252565,\n \"acc_stderr\": 0.012533292732620296\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.20849128127369218,\n \
\ \"acc_stderr\": 0.011189587985791425\n }\n}\n```"
repo_url: https://huggingface.co/elinas/chronos-mistral-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-39-41.464301.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-39-41.464301.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- '**/details_harness|winogrande|5_2024-04-09T06-39-41.464301.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T06-39-41.464301.parquet'
- config_name: results
data_files:
- split: 2024_04_09T06_39_41.464301
path:
- results_2024-04-09T06-39-41.464301.parquet
- split: latest
path:
- results_2024-04-09T06-39-41.464301.parquet
---
# Dataset Card for Evaluation run of elinas/chronos-mistral-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [elinas/chronos-mistral-7b](https://huggingface.co/elinas/chronos-mistral-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_elinas__chronos-mistral-7b",
"harness_winogrande_5",
split="train")
```
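Each evaluated task is exposed as its own configuration (plus the aggregated `results` configuration), so you can enumerate them before picking one. A minimal sketch, assuming the same `datasets` setup as above; the printed names are whatever configurations this repository actually exposes:
```python
from datasets import get_dataset_config_names, load_dataset

# List every available configuration (one per evaluated task, plus "results").
configs = get_dataset_config_names("open-llm-leaderboard/details_elinas__chronos-mistral-7b")
print(configs)

# Each configuration also defines a "latest" split pointing at the most recent run.
latest = load_dataset(
    "open-llm-leaderboard/details_elinas__chronos-mistral-7b",
    "harness_winogrande_5",
    split="latest",
)
print(latest)
```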
## Latest results
These are the [latest results from run 2024-04-09T06:39:41.464301](https://huggingface.co/datasets/open-llm-leaderboard/details_elinas__chronos-mistral-7b/blob/main/results_2024-04-09T06-39-41.464301.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4943261332869945,
"acc_stderr": 0.03440242385841512,
"acc_norm": 0.4990977698278415,
"acc_norm_stderr": 0.035152578733964476,
"mc1": 0.31701346389228885,
"mc1_stderr": 0.01628920337440338,
"mc2": 0.48059222372011373,
"mc2_stderr": 0.014984088747615087
},
"harness|arc:challenge|25": {
"acc": 0.5273037542662116,
"acc_stderr": 0.014589589101985989,
"acc_norm": 0.5580204778156996,
"acc_norm_stderr": 0.014512682523128342
},
"harness|hellaswag|10": {
"acc": 0.5751842262497511,
"acc_stderr": 0.004933047726996793,
"acc_norm": 0.7719577773351922,
"acc_norm_stderr": 0.004187124964848515
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4740740740740741,
"acc_stderr": 0.04313531696750574,
"acc_norm": 0.4740740740740741,
"acc_norm_stderr": 0.04313531696750574
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04046336883978251,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04046336883978251
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5018867924528302,
"acc_stderr": 0.030772653642075657,
"acc_norm": 0.5018867924528302,
"acc_norm_stderr": 0.030772653642075657
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.04174752578923185,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.04174752578923185
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.4508670520231214,
"acc_stderr": 0.03794012674697029,
"acc_norm": 0.4508670520231214,
"acc_norm_stderr": 0.03794012674697029
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.41702127659574467,
"acc_stderr": 0.032232762667117124,
"acc_norm": 0.41702127659574467,
"acc_norm_stderr": 0.032232762667117124
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31746031746031744,
"acc_stderr": 0.023973861998992062,
"acc_norm": 0.31746031746031744,
"acc_norm_stderr": 0.023973861998992062
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.0404061017820884,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.0404061017820884
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5935483870967742,
"acc_stderr": 0.02794172734625631,
"acc_norm": 0.5935483870967742,
"acc_norm_stderr": 0.02794172734625631
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.35467980295566504,
"acc_stderr": 0.0336612448905145,
"acc_norm": 0.35467980295566504,
"acc_norm_stderr": 0.0336612448905145
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6242424242424243,
"acc_stderr": 0.03781887353205982,
"acc_norm": 0.6242424242424243,
"acc_norm_stderr": 0.03781887353205982
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.5757575757575758,
"acc_stderr": 0.03521224908841586,
"acc_norm": 0.5757575757575758,
"acc_norm_stderr": 0.03521224908841586
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6994818652849741,
"acc_stderr": 0.0330881859441575,
"acc_norm": 0.6994818652849741,
"acc_norm_stderr": 0.0330881859441575
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.44871794871794873,
"acc_stderr": 0.025217315184846486,
"acc_norm": 0.44871794871794873,
"acc_norm_stderr": 0.025217315184846486
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911498,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911498
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.453781512605042,
"acc_stderr": 0.03233943468182088,
"acc_norm": 0.453781512605042,
"acc_norm_stderr": 0.03233943468182088
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526731,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526731
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.6458715596330276,
"acc_stderr": 0.020504729013829125,
"acc_norm": 0.6458715596330276,
"acc_norm_stderr": 0.020504729013829125
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6519607843137255,
"acc_stderr": 0.03343311240488418,
"acc_norm": 0.6519607843137255,
"acc_norm_stderr": 0.03343311240488418
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5991561181434599,
"acc_stderr": 0.03190080389473235,
"acc_norm": 0.5991561181434599,
"acc_norm_stderr": 0.03190080389473235
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6053811659192825,
"acc_stderr": 0.03280400504755291,
"acc_norm": 0.6053811659192825,
"acc_norm_stderr": 0.03280400504755291
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.48854961832061067,
"acc_stderr": 0.043841400240780176,
"acc_norm": 0.48854961832061067,
"acc_norm_stderr": 0.043841400240780176
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6859504132231405,
"acc_stderr": 0.04236964753041018,
"acc_norm": 0.6859504132231405,
"acc_norm_stderr": 0.04236964753041018
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.04812917324536824,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.04812917324536824
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.5766871165644172,
"acc_stderr": 0.03881891213334384,
"acc_norm": 0.5766871165644172,
"acc_norm_stderr": 0.03881891213334384
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7991452991452992,
"acc_stderr": 0.02624677294689048,
"acc_norm": 0.7991452991452992,
"acc_norm_stderr": 0.02624677294689048
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6768837803320562,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.6768837803320562,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4624277456647399,
"acc_stderr": 0.026842985519615375,
"acc_norm": 0.4624277456647399,
"acc_norm_stderr": 0.026842985519615375
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.37988826815642457,
"acc_stderr": 0.016232826818678492,
"acc_norm": 0.37988826815642457,
"acc_norm_stderr": 0.016232826818678492
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5098039215686274,
"acc_stderr": 0.028624412550167958,
"acc_norm": 0.5098039215686274,
"acc_norm_stderr": 0.028624412550167958
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5466237942122186,
"acc_stderr": 0.02827435985489424,
"acc_norm": 0.5466237942122186,
"acc_norm_stderr": 0.02827435985489424
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5462962962962963,
"acc_stderr": 0.0277012284685426,
"acc_norm": 0.5462962962962963,
"acc_norm_stderr": 0.0277012284685426
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3404255319148936,
"acc_stderr": 0.028267657482650144,
"acc_norm": 0.3404255319148936,
"acc_norm_stderr": 0.028267657482650144
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.363754889178618,
"acc_stderr": 0.012286991879902887,
"acc_norm": 0.363754889178618,
"acc_norm_stderr": 0.012286991879902887
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4852941176470588,
"acc_stderr": 0.03035969707904612,
"acc_norm": 0.4852941176470588,
"acc_norm_stderr": 0.03035969707904612
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4738562091503268,
"acc_stderr": 0.020200164564804588,
"acc_norm": 0.4738562091503268,
"acc_norm_stderr": 0.020200164564804588
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6,
"acc_stderr": 0.0469237132203465,
"acc_norm": 0.6,
"acc_norm_stderr": 0.0469237132203465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5224489795918368,
"acc_stderr": 0.03197694118713672,
"acc_norm": 0.5224489795918368,
"acc_norm_stderr": 0.03197694118713672
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5920398009950248,
"acc_stderr": 0.03475116365194092,
"acc_norm": 0.5920398009950248,
"acc_norm_stderr": 0.03475116365194092
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-virology|5": {
"acc": 0.45180722891566266,
"acc_stderr": 0.038743715565879536,
"acc_norm": 0.45180722891566266,
"acc_norm_stderr": 0.038743715565879536
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6900584795321637,
"acc_stderr": 0.035469769593931624,
"acc_norm": 0.6900584795321637,
"acc_norm_stderr": 0.035469769593931624
},
"harness|truthfulqa:mc|0": {
"mc1": 0.31701346389228885,
"mc1_stderr": 0.01628920337440338,
"mc2": 0.48059222372011373,
"mc2_stderr": 0.014984088747615087
},
"harness|winogrande|5": {
"acc": 0.7261247040252565,
"acc_stderr": 0.012533292732620296
},
"harness|gsm8k|5": {
"acc": 0.20849128127369218,
"acc_stderr": 0.011189587985791425
}
}
```
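The aggregated numbers above can also be read programmatically from the `results` configuration instead of being copied from this card. A minimal sketch, assuming the same `datasets` setup as in the loading example; inspect the returned dataset before relying on specific column names:
```python
from datasets import load_dataset

# The "results" configuration stores the aggregated metrics of each run;
# its "latest" split points at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_elinas__chronos-mistral-7b",
    "results",
    split="latest",
)
print(results)  # check the available columns before reading individual metrics
```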
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_nbeerbower__bophades-mistral-math-DPO-7B | open-llm-leaderboard-old | "2024-04-09T06:47:07Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T06:46:10Z" | ---
pretty_name: Evaluation run of nbeerbower/bophades-mistral-math-DPO-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/bophades-mistral-math-DPO-7B](https://huggingface.co/nbeerbower/bophades-mistral-math-DPO-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__bophades-mistral-math-DPO-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T06:43:49.687940](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bophades-mistral-math-DPO-7B/blob/main/results_2024-04-09T06-43-49.687940.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6528482939191733,\n\
\ \"acc_stderr\": 0.03209418858852385,\n \"acc_norm\": 0.651783654728172,\n\
\ \"acc_norm_stderr\": 0.03277391168213738,\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.7749888585200013,\n\
\ \"mc2_stderr\": 0.013848595999672798\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7192832764505119,\n \"acc_stderr\": 0.01313123812697558,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7189802828121888,\n\
\ \"acc_stderr\": 0.004485784468576664,\n \"acc_norm\": 0.8902609042023502,\n\
\ \"acc_norm_stderr\": 0.0031192548288489484\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923996,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923996\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.43575418994413406,\n\
\ \"acc_stderr\": 0.01658388195860239,\n \"acc_norm\": 0.43575418994413406,\n\
\ \"acc_norm_stderr\": 0.01658388195860239\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.025829163272757482,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.025829163272757482\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188933,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188933\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47392438070404175,\n\
\ \"acc_stderr\": 0.012752858346533127,\n \"acc_norm\": 0.47392438070404175,\n\
\ \"acc_norm_stderr\": 0.012752858346533127\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6217870257037944,\n\
\ \"mc1_stderr\": 0.016976335907546866,\n \"mc2\": 0.7749888585200013,\n\
\ \"mc2_stderr\": 0.013848595999672798\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8555643251775849,\n \"acc_stderr\": 0.009879767358079232\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6944655041698257,\n \
\ \"acc_stderr\": 0.012688134076726879\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/bophades-mistral-math-DPO-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-43-49.687940.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-43-49.687940.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- '**/details_harness|winogrande|5_2024-04-09T06-43-49.687940.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T06-43-49.687940.parquet'
- config_name: results
data_files:
- split: 2024_04_09T06_43_49.687940
path:
- results_2024-04-09T06-43-49.687940.parquet
- split: latest
path:
- results_2024-04-09T06-43-49.687940.parquet
---
# Dataset Card for Evaluation run of nbeerbower/bophades-mistral-math-DPO-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/bophades-mistral-math-DPO-7B](https://huggingface.co/nbeerbower/bophades-mistral-math-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__bophades-mistral-math-DPO-7B",
"harness_winogrande_5",
split="train")
```
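Each configuration also exposes a `latest` split (plus one split per run timestamp, as listed in the configs above), so you can pin the most recent run of a single task instead of following `train`. A minimal sketch, reusing the same task configuration as the example above:
```python
from datasets import load_dataset

# Load the most recent run of a single task via the "latest" split.
latest = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__bophades-mistral-math-DPO-7B",
    "harness_winogrande_5",
    split="latest",
)
print(len(latest))  # number of per-example records for this task
```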
## Latest results
These are the [latest results from run 2024-04-09T06:43:49.687940](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bophades-mistral-math-DPO-7B/blob/main/results_2024-04-09T06-43-49.687940.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6528482939191733,
"acc_stderr": 0.03209418858852385,
"acc_norm": 0.651783654728172,
"acc_norm_stderr": 0.03277391168213738,
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.7749888585200013,
"mc2_stderr": 0.013848595999672798
},
"harness|arc:challenge|25": {
"acc": 0.7192832764505119,
"acc_stderr": 0.01313123812697558,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7189802828121888,
"acc_stderr": 0.004485784468576664,
"acc_norm": 0.8902609042023502,
"acc_norm_stderr": 0.0031192548288489484
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923996,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923996
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.43575418994413406,
"acc_stderr": 0.01658388195860239,
"acc_norm": 0.43575418994413406,
"acc_norm_stderr": 0.01658388195860239
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.025829163272757482,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.025829163272757482
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188933,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188933
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47392438070404175,
"acc_stderr": 0.012752858346533127,
"acc_norm": 0.47392438070404175,
"acc_norm_stderr": 0.012752858346533127
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6217870257037944,
"mc1_stderr": 0.016976335907546866,
"mc2": 0.7749888585200013,
"mc2_stderr": 0.013848595999672798
},
"harness|winogrande|5": {
"acc": 0.8555643251775849,
"acc_stderr": 0.009879767358079232
},
"harness|gsm8k|5": {
"acc": 0.6944655041698257,
"acc_stderr": 0.012688134076726879
}
}
```
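The same aggregated numbers are also stored in the `results` configuration of this dataset (see the config listing above); the snippet below is a minimal sketch for pulling its latest split programmatically:
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics shown above.
results = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__bophades-mistral-math-DPO-7B",
    "results",
    split="latest",
)
print(results[0])  # a single row containing the aggregated scores per task
```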
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_nbeerbower__bophades-mistral-truthy-DPO-7B | open-llm-leaderboard-old | "2024-04-09T06:48:25Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T06:46:45Z" | ---
pretty_name: Evaluation run of nbeerbower/bophades-mistral-truthy-DPO-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/bophades-mistral-truthy-DPO-7B](https://huggingface.co/nbeerbower/bophades-mistral-truthy-DPO-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__bophades-mistral-truthy-DPO-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T06:44:24.324049](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bophades-mistral-truthy-DPO-7B/blob/main/results_2024-04-09T06-44-24.324049.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6537827052889121,\n\
\ \"acc_stderr\": 0.03209827481174341,\n \"acc_norm\": 0.652963009270233,\n\
\ \"acc_norm_stderr\": 0.03277384510844206,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7788431978474983,\n\
\ \"mc2_stderr\": 0.01377002928792248\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523197\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7212706632144991,\n\
\ \"acc_stderr\": 0.004474577054517446,\n \"acc_norm\": 0.8927504481179048,\n\
\ \"acc_norm_stderr\": 0.0030879787141283705\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106136,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106136\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644237,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644237\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606649,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606649\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4346368715083799,\n\
\ \"acc_stderr\": 0.016578997435496713,\n \"acc_norm\": 0.4346368715083799,\n\
\ \"acc_norm_stderr\": 0.016578997435496713\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823696,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823696\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7428571428571429,\n \"acc_stderr\": 0.02797982353874455,\n\
\ \"acc_norm\": 0.7428571428571429,\n \"acc_norm_stderr\": 0.02797982353874455\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7788431978474983,\n\
\ \"mc2_stderr\": 0.01377002928792248\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.009990706005184136\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.01269693010656291\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/bophades-mistral-truthy-DPO-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-44-24.324049.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-44-24.324049.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- '**/details_harness|winogrande|5_2024-04-09T06-44-24.324049.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T06-44-24.324049.parquet'
- config_name: results
data_files:
- split: 2024_04_09T06_44_24.324049
path:
- results_2024-04-09T06-44-24.324049.parquet
- split: latest
path:
- results_2024-04-09T06-44-24.324049.parquet
---
# Dataset Card for Evaluation run of nbeerbower/bophades-mistral-truthy-DPO-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/bophades-mistral-truthy-DPO-7B](https://huggingface.co/nbeerbower/bophades-mistral-truthy-DPO-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__bophades-mistral-truthy-DPO-7B",
"harness_winogrande_5",
split="train")
```
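The same call works for any of the timestamped or "latest" splits declared in the YAML header above. A minimal sketch (assuming the `datasets` library is installed; the config and split names are taken from the header of this card):
```python
from datasets import load_dataset

# Load the most recent run of a single task via its "latest" split.
# Config names follow the harness_<task>_<n-shot> pattern listed above.
data = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__bophades-mistral-truthy-DPO-7B",
    "harness_hendrycksTest_world_religions_5",
    split="latest",
)
print(data[0])  # inspect one evaluated example with its per-sample details
```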
## Latest results
These are the [latest results from run 2024-04-09T06:44:24.324049](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__bophades-mistral-truthy-DPO-7B/blob/main/results_2024-04-09T06-44-24.324049.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6537827052889121,
"acc_stderr": 0.03209827481174341,
"acc_norm": 0.652963009270233,
"acc_norm_stderr": 0.03277384510844206,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7788431978474983,
"mc2_stderr": 0.01377002928792248
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523197
},
"harness|hellaswag|10": {
"acc": 0.7212706632144991,
"acc_stderr": 0.004474577054517446,
"acc_norm": 0.8927504481179048,
"acc_norm_stderr": 0.0030879787141283705
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106136,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106136
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644237,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644237
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606649,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606649
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4346368715083799,
"acc_stderr": 0.016578997435496713,
"acc_norm": 0.4346368715083799,
"acc_norm_stderr": 0.016578997435496713
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823696,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823696
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7428571428571429,
"acc_stderr": 0.02797982353874455,
"acc_norm": 0.7428571428571429,
"acc_norm_stderr": 0.02797982353874455
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7788431978474983,
"mc2_stderr": 0.01377002928792248
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.009990706005184136
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.01269693010656291
}
}
```
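Instead of parsing the JSON above by hand, the aggregated metrics can also be read from the "results" configuration declared in the YAML header. A minimal sketch, assuming `pandas` is available alongside `datasets` (the exact column layout of the results parquet is not documented here, so the example only inspects what is present):
```python
from datasets import load_dataset

# The "results" config holds the aggregated metrics for this model;
# its "latest" split points at the most recent results parquet.
results = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__bophades-mistral-truthy-DPO-7B",
    "results",
    split="latest",
)

# Convert to pandas for quick inspection of the available fields.
df = results.to_pandas()
print(df.columns.tolist())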
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
HeonWoo22/my_dataset | HeonWoo22 | "2024-04-11T08:52:27Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T06:48:31Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: text
dtype: string
splits:
- name: train
num_bytes: 1419447.0
num_examples: 63
download_size: 1418242
dataset_size: 1419447.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_netcat420__MFANNv0.4 | open-llm-leaderboard-old | "2024-04-09T06:50:29Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T06:50:08Z" | ---
pretty_name: Evaluation run of netcat420/MFANNv0.4
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [netcat420/MFANNv0.4](https://huggingface.co/netcat420/MFANNv0.4) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_netcat420__MFANNv0.4\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T06:47:38.111444](https://huggingface.co/datasets/open-llm-leaderboard/details_netcat420__MFANNv0.4/blob/main/results_2024-04-09T06-47-38.111444.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6371943268205861,\n\
\ \"acc_stderr\": 0.03246901392972694,\n \"acc_norm\": 0.6377161445813604,\n\
\ \"acc_norm_stderr\": 0.03312827704029973,\n \"mc1\": 0.5336597307221542,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.7139881282663555,\n\
\ \"mc2_stderr\": 0.01519479061727556\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6706484641638225,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.6953924914675768,\n \"acc_norm_stderr\": 0.013449522109932483\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7017526389165505,\n\
\ \"acc_stderr\": 0.004565536808632543,\n \"acc_norm\": 0.8665604461262697,\n\
\ \"acc_norm_stderr\": 0.003393542074227652\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"acc_norm_stderr\"\
: 0.05\n },\n \"harness|hendrycksTest-clinical_knowledge|5\": {\n \"\
acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n \
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7430555555555556,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.7430555555555556,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\"\
: 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \"acc_norm\": 0.34,\n\
\ \"acc_norm_stderr\": 0.04760952285695235\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.653179190751445,\n \"acc_stderr\": 0.036291466701596636,\n\
\ \"acc_norm\": 0.653179190751445,\n \"acc_norm_stderr\": 0.036291466701596636\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5319148936170213,\n\
\ \"acc_stderr\": 0.03261936918467381,\n \"acc_norm\": 0.5319148936170213,\n\
\ \"acc_norm_stderr\": 0.03261936918467381\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.45614035087719296,\n \"acc_stderr\": 0.046854730419077895,\n\
\ \"acc_norm\": 0.45614035087719296,\n \"acc_norm_stderr\": 0.046854730419077895\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n \"acc_norm\"\
: 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n },\n\
\ \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\": 0.42592592592592593,\n\
\ \"acc_stderr\": 0.02546714904546955,\n \"acc_norm\": 0.42592592592592593,\n\
\ \"acc_norm_stderr\": 0.02546714904546955\n },\n \"harness|hendrycksTest-formal_logic|5\"\
: {\n \"acc\": 0.3968253968253968,\n \"acc_stderr\": 0.043758884927270605,\n\
\ \"acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.043758884927270605\n\
\ },\n \"harness|hendrycksTest-global_facts|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-high_school_biology|5\"\
: {\n \"acc\": 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n\
\ \"acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131137,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131137\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4105960264900662,\n \"acc_stderr\": 0.04016689594849929,\n \"\
acc_norm\": 0.4105960264900662,\n \"acc_norm_stderr\": 0.04016689594849929\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848036,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848036\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026156867523931045,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026156867523931045\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7974683544303798,\n \"acc_stderr\": 0.026160568246601436,\n \
\ \"acc_norm\": 0.7974683544303798,\n \"acc_norm_stderr\": 0.026160568246601436\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.04058042015646034,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.04058042015646034\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.01389086216287616,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.01389086216287616\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.38324022346368714,\n\
\ \"acc_stderr\": 0.016260159604429128,\n \"acc_norm\": 0.38324022346368714,\n\
\ \"acc_norm_stderr\": 0.016260159604429128\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886324,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083138,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083138\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411945,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411945\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6633986928104575,\n \"acc_stderr\": 0.019117213911495148,\n \
\ \"acc_norm\": 0.6633986928104575,\n \"acc_norm_stderr\": 0.019117213911495148\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.0293936093198798,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.0293936093198798\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.038641399236991225,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.038641399236991225\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5336597307221542,\n\
\ \"mc1_stderr\": 0.017463793867168106,\n \"mc2\": 0.7139881282663555,\n\
\ \"mc2_stderr\": 0.01519479061727556\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7947908445146015,\n \"acc_stderr\": 0.011350315707462063\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6383623957543594,\n \
\ \"acc_stderr\": 0.013234658351088776\n }\n}\n```"
repo_url: https://huggingface.co/netcat420/MFANNv0.4
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-47-38.111444.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T06-47-38.111444.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- '**/details_harness|winogrande|5_2024-04-09T06-47-38.111444.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T06-47-38.111444.parquet'
- config_name: results
data_files:
- split: 2024_04_09T06_47_38.111444
path:
- results_2024-04-09T06-47-38.111444.parquet
- split: latest
path:
- results_2024-04-09T06-47-38.111444.parquet
---
# Dataset Card for Evaluation run of netcat420/MFANNv0.4
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [netcat420/MFANNv0.4](https://huggingface.co/netcat420/MFANNv0.4) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_netcat420__MFANNv0.4",
"harness_winogrande_5",
split="train")
```
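The example above uses the default split of a single configuration. Each configuration listed in the YAML header also exposes a timestamped split per run plus a "latest" alias, so a minimal sketch along the same lines (assuming the `datasets` library is installed) for pulling the most recent per-sample details of one MMLU subtask would be:
```python
from datasets import load_dataset

# Per-task details live in one configuration per benchmark subtask;
# the "latest" split is an alias for the newest timestamped run
# declared in the YAML header of this card.
details = load_dataset(
    "open-llm-leaderboard/details_netcat420__MFANNv0.4",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)

print(details)     # column names and number of evaluated examples
print(details[0])  # first evaluated example; exact columns depend on the harness version
```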
## Latest results
These are the [latest results from run 2024-04-09T06:47:38.111444](https://huggingface.co/datasets/open-llm-leaderboard/details_netcat420__MFANNv0.4/blob/main/results_2024-04-09T06-47-38.111444.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6371943268205861,
"acc_stderr": 0.03246901392972694,
"acc_norm": 0.6377161445813604,
"acc_norm_stderr": 0.03312827704029973,
"mc1": 0.5336597307221542,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.7139881282663555,
"mc2_stderr": 0.01519479061727556
},
"harness|arc:challenge|25": {
"acc": 0.6706484641638225,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.6953924914675768,
"acc_norm_stderr": 0.013449522109932483
},
"harness|hellaswag|10": {
"acc": 0.7017526389165505,
"acc_stderr": 0.004565536808632543,
"acc_norm": 0.8665604461262697,
"acc_norm_stderr": 0.003393542074227652
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7430555555555556,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.7430555555555556,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467381,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467381
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131137,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131137
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4105960264900662,
"acc_stderr": 0.04016689594849929,
"acc_norm": 0.4105960264900662,
"acc_norm_stderr": 0.04016689594849929
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848036,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848036
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026156867523931045,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026156867523931045
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7974683544303798,
"acc_stderr": 0.026160568246601436,
"acc_norm": 0.7974683544303798,
"acc_norm_stderr": 0.026160568246601436
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.04058042015646034,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.04058042015646034
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.01389086216287616,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.01389086216287616
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.38324022346368714,
"acc_stderr": 0.016260159604429128,
"acc_norm": 0.38324022346368714,
"acc_norm_stderr": 0.016260159604429128
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886324,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083138,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083138
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411945,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411945
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6633986928104575,
"acc_stderr": 0.019117213911495148,
"acc_norm": 0.6633986928104575,
"acc_norm_stderr": 0.019117213911495148
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.0293936093198798,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.0293936093198798
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.038641399236991225,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.038641399236991225
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5336597307221542,
"mc1_stderr": 0.017463793867168106,
"mc2": 0.7139881282663555,
"mc2_stderr": 0.01519479061727556
},
"harness|winogrande|5": {
"acc": 0.7947908445146015,
"acc_stderr": 0.011350315707462063
},
"harness|gsm8k|5": {
"acc": 0.6383623957543594,
"acc_stderr": 0.013234658351088776
}
}
```
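The aggregated numbers above are also stored in the "results" configuration declared in the YAML header, so they can be read programmatically instead of copied from the JSON. A small sketch, assuming the `datasets` library is available (the exact parquet schema may differ slightly from the JSON view):
```python
from datasets import load_dataset

# "results" is the aggregated-metrics configuration; its "latest" split
# points at the most recent results_*.parquet file for this model.
results = load_dataset(
    "open-llm-leaderboard/details_netcat420__MFANNv0.4",
    "results",
    split="latest",
)

# One row holding the per-task metrics; the layout should mirror the
# JSON block shown above.
print(results[0])
```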
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_nbeerbower__HeroBophades-2x7B | open-llm-leaderboard-old | "2024-04-09T07:22:07Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T07:21:46Z" | ---
pretty_name: Evaluation run of nbeerbower/HeroBophades-2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/HeroBophades-2x7B](https://huggingface.co/nbeerbower/HeroBophades-2x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T07:19:29.226434](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B/blob/main/results_2024-04-09T07-19-29.226434.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6530925043155706,\n\
\ \"acc_stderr\": 0.0321263146530597,\n \"acc_norm\": 0.6521487655754685,\n\
\ \"acc_norm_stderr\": 0.03280429774578277,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7786555473070617,\n\
\ \"mc2_stderr\": 0.013750818263207308\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710698\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.722266480780721,\n\
\ \"acc_stderr\": 0.004469659042824775,\n \"acc_norm\": 0.8911571400119498,\n\
\ \"acc_norm_stderr\": 0.0031080545633521105\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7916666666666666,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.7916666666666666,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n\
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8422018348623853,\n \"acc_stderr\": 0.01563002297009244,\n \"\
acc_norm\": 0.8422018348623853,\n \"acc_norm_stderr\": 0.01563002297009244\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.016558601636041035,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.016558601636041035\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.025917806117147158,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.025917806117147158\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823698,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7786555473070617,\n\
\ \"mc2_stderr\": 0.013750818263207308\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8524072612470402,\n \"acc_stderr\": 0.00996871576547965\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6937073540561031,\n \
\ \"acc_stderr\": 0.012696930106562906\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/HeroBophades-2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|arc:challenge|25_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|gsm8k|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hellaswag|10_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-19-29.226434.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T07-19-29.226434.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- '**/details_harness|winogrande|5_2024-04-09T07-19-29.226434.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T07-19-29.226434.parquet'
- config_name: results
data_files:
- split: 2024_04_09T07_19_29.226434
path:
- results_2024-04-09T07-19-29.226434.parquet
- split: latest
path:
- results_2024-04-09T07-19-29.226434.parquet
---
# Dataset Card for Evaluation run of nbeerbower/HeroBophades-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/HeroBophades-2x7B](https://huggingface.co/nbeerbower/HeroBophades-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B",
"harness_winogrande_5",
    split="latest")
```
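Once loaded, the split behaves like any other `datasets` split. As a minimal follow-up sketch (assuming only the `datasets` library and the configuration names listed in this card), you can inspect the schema and a single evaluated example:
```python
from datasets import load_dataset

# "latest" always points to the most recent evaluation run for this model.
data = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B",
    "harness_winogrande_5",
    split="latest",
)

# Inspect the per-example details produced by the harness.
print(data.column_names)
print(data[0])
```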
## Latest results
These are the [latest results from run 2024-04-09T07:19:29.226434](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B/blob/main/results_2024-04-09T07-19-29.226434.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the "results" and the "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6530925043155706,
"acc_stderr": 0.0321263146530597,
"acc_norm": 0.6521487655754685,
"acc_norm_stderr": 0.03280429774578277,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7786555473070617,
"mc2_stderr": 0.013750818263207308
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710698
},
"harness|hellaswag|10": {
"acc": 0.722266480780721,
"acc_stderr": 0.004469659042824775,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.0031080545633521105
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7916666666666666,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.7916666666666666,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8422018348623853,
"acc_stderr": 0.01563002297009244,
"acc_norm": 0.8422018348623853,
"acc_norm_stderr": 0.01563002297009244
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.016558601636041035,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.016558601636041035
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.025917806117147158,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.025917806117147158
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823698,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7786555473070617,
"mc2_stderr": 0.013750818263207308
},
"harness|winogrande|5": {
"acc": 0.8524072612470402,
"acc_stderr": 0.00996871576547965
},
"harness|gsm8k|5": {
"acc": 0.6937073540561031,
"acc_stderr": 0.012696930106562906
}
}
```
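The aggregated numbers above are also stored in the `results` configuration of this dataset, so they can be read programmatically instead of being copied from this card. Below is a small sketch (assuming only the `datasets` library; the exact column layout is produced by the evaluation harness and may differ between runs):
```python
from datasets import load_dataset

# The "results" configuration holds the aggregated metrics dump for each run;
# the "latest" split points to the most recent one.
results = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B",
    "results",
    split="latest",
)

# Print the available columns and the first (and typically only) row.
print(results.column_names)
print(results[0])
```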
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
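As a partial starting point, the structure mirrors the `configs` list in the YAML header above: one configuration per evaluated task plus an aggregated `results` configuration, each exposing a timestamp-named split and a `latest` split. A short sketch for enumerating them (assuming the `datasets` library; `get_dataset_config_names` is part of its public API):
```python
from datasets import get_dataset_config_names

# List every configuration of this evaluation dataset (one per task, plus "results").
configs = get_dataset_config_names(
    "open-llm-leaderboard/details_nbeerbower__HeroBophades-2x7B"
)
print(len(configs), "configurations")
print(configs[:5])
```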
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_nbeerbower__SuperFlammen-4x7B | open-llm-leaderboard-old | "2024-04-09T07:40:15Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T07:39:55Z" | ---
pretty_name: Evaluation run of nbeerbower/SuperFlammen-4x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/SuperFlammen-4x7B](https://huggingface.co/nbeerbower/SuperFlammen-4x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__SuperFlammen-4x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T07:37:31.585110](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__SuperFlammen-4x7B/blob/main/results_2024-04-09T07-37-31.585110.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6543689155321306,\n\
\ \"acc_stderr\": 0.032105259979613214,\n \"acc_norm\": 0.6538825431007639,\n\
\ \"acc_norm_stderr\": 0.03277387241517634,\n \"mc1\": 0.5667074663402693,\n\
\ \"mc1_stderr\": 0.017347024450107478,\n \"mc2\": 0.7264590049689311,\n\
\ \"mc2_stderr\": 0.014648935765095752\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6962457337883959,\n \"acc_stderr\": 0.013438909184778764,\n\
\ \"acc_norm\": 0.7226962457337884,\n \"acc_norm_stderr\": 0.013082095839059376\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7105158334993029,\n\
\ \"acc_stderr\": 0.0045259609655517044,\n \"acc_norm\": 0.8850826528579964,\n\
\ \"acc_norm_stderr\": 0.003182703830351134\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.024121125416941197,\n\
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.024121125416941197\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.02874204090394848,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.02874204090394848\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461783,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461783\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.025524722324553346,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.025524722324553346\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.031024411740572213,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.031024411740572213\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.03322015795776741,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.03322015795776741\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.021586494001281365,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.021586494001281365\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903335,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903335\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258176,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4435754189944134,\n\
\ \"acc_stderr\": 0.01661568040100372,\n \"acc_norm\": 0.4435754189944134,\n\
\ \"acc_norm_stderr\": 0.01661568040100372\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47783572359843546,\n\
\ \"acc_stderr\": 0.012757683047716175,\n \"acc_norm\": 0.47783572359843546,\n\
\ \"acc_norm_stderr\": 0.012757683047716175\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.028332959514031208,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.028332959514031208\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6830065359477124,\n \"acc_stderr\": 0.018824219512706207,\n \
\ \"acc_norm\": 0.6830065359477124,\n \"acc_norm_stderr\": 0.018824219512706207\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5667074663402693,\n\
\ \"mc1_stderr\": 0.017347024450107478,\n \"mc2\": 0.7264590049689311,\n\
\ \"mc2_stderr\": 0.014648935765095752\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8310970797158642,\n \"acc_stderr\": 0.010529981411838911\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7073540561031084,\n \
\ \"acc_stderr\": 0.012532334368242906\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/SuperFlammen-4x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|arc:challenge|25_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|gsm8k|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hellaswag|10_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-37-31.585110.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T07-37-31.585110.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- '**/details_harness|winogrande|5_2024-04-09T07-37-31.585110.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T07-37-31.585110.parquet'
- config_name: results
data_files:
- split: 2024_04_09T07_37_31.585110
path:
- results_2024-04-09T07-37-31.585110.parquet
- split: latest
path:
- results_2024-04-09T07-37-31.585110.parquet
---
# Dataset Card for Evaluation run of nbeerbower/SuperFlammen-4x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/SuperFlammen-4x7B](https://huggingface.co/nbeerbower/SuperFlammen-4x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__SuperFlammen-4x7B",
"harness_winogrande_5",
split="train")
```
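The aggregated metrics live in the "results" configuration listed above. A minimal sketch of loading them is shown below; it assumes the `latest` split declared in the config list and that the `datasets` library resolves the parquet files referenced there:
```python
from datasets import load_dataset

# Aggregated metrics for this model; "latest" always points to the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_nbeerbower__SuperFlammen-4x7B",
    "results",
    split="latest",
)

# Each row corresponds to one evaluation run; inspect the stored metrics.
print(results[0])
```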
## Latest results
These are the [latest results from run 2024-04-09T07:37:31.585110](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__SuperFlammen-4x7B/blob/main/results_2024-04-09T07-37-31.585110.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6543689155321306,
"acc_stderr": 0.032105259979613214,
"acc_norm": 0.6538825431007639,
"acc_norm_stderr": 0.03277387241517634,
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107478,
"mc2": 0.7264590049689311,
"mc2_stderr": 0.014648935765095752
},
"harness|arc:challenge|25": {
"acc": 0.6962457337883959,
"acc_stderr": 0.013438909184778764,
"acc_norm": 0.7226962457337884,
"acc_norm_stderr": 0.013082095839059376
},
"harness|hellaswag|10": {
"acc": 0.7105158334993029,
"acc_stderr": 0.0045259609655517044,
"acc_norm": 0.8850826528579964,
"acc_norm_stderr": 0.003182703830351134
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.024121125416941197,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.024121125416941197
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.02874204090394848,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.02874204090394848
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461783,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461783
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.025524722324553346,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.025524722324553346
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.031024411740572213,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.031024411740572213
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.021586494001281365,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.021586494001281365
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903335,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903335
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258176,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4435754189944134,
"acc_stderr": 0.01661568040100372,
"acc_norm": 0.4435754189944134,
"acc_norm_stderr": 0.01661568040100372
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47783572359843546,
"acc_stderr": 0.012757683047716175,
"acc_norm": 0.47783572359843546,
"acc_norm_stderr": 0.012757683047716175
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.028332959514031208,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.028332959514031208
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6830065359477124,
"acc_stderr": 0.018824219512706207,
"acc_norm": 0.6830065359477124,
"acc_norm_stderr": 0.018824219512706207
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5667074663402693,
"mc1_stderr": 0.017347024450107478,
"mc2": 0.7264590049689311,
"mc2_stderr": 0.014648935765095752
},
"harness|winogrande|5": {
"acc": 0.8310970797158642,
"acc_stderr": 0.010529981411838911
},
"harness|gsm8k|5": {
"acc": 0.7073540561031084,
"acc_stderr": 0.012532334368242906
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
malaysia-ai/crawl-youtube-mandarin | malaysia-ai | "2024-04-19T04:33:54Z" | 0 | 1 | [
"language:zh",
"region:us"
] | null | "2024-04-09T07:41:19Z" | ---
language:
- zh
---
# Mandarin Youtube
Source code at https://github.com/mesolitica/malaysian-dataset/tree/master/speech/mandarin-youtube |
open-llm-leaderboard-old/details_4season__alignment-model-test10 | open-llm-leaderboard-old | "2024-04-09T08:14:48Z" | 0 | 1 | [
"region:us"
] | null | "2024-04-09T08:14:15Z" | ---
pretty_name: Evaluation run of 4season/alignment-model-test10
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [4season/alignment-model-test10](https://huggingface.co/4season/alignment-model-test10)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_4season__alignment-model-test10\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T08:12:37.264622](https://huggingface.co/datasets/open-llm-leaderboard/details_4season__alignment-model-test10/blob/main/results_2024-04-09T08-12-37.264622.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6830251707414831,\n\
\ \"acc_stderr\": 0.03150837150158549,\n \"acc_norm\": 0.6842989605978566,\n\
\ \"acc_norm_stderr\": 0.032158515186000075,\n \"mc1\": 0.5691554467564259,\n\
\ \"mc1_stderr\": 0.01733527247533237,\n \"mc2\": 0.710829343782068,\n\
\ \"mc2_stderr\": 0.014802276642222825\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7738907849829352,\n \"acc_stderr\": 0.012224202097063276,\n\
\ \"acc_norm\": 0.7960750853242321,\n \"acc_norm_stderr\": 0.01177426247870226\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7762397928699463,\n\
\ \"acc_stderr\": 0.004159114679873824,\n \"acc_norm\": 0.9001194981079467,\n\
\ \"acc_norm_stderr\": 0.002992278134932447\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.033911609343436025,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.033911609343436025\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.72,\n\
\ \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n \
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724067,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724067\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8194444444444444,\n\
\ \"acc_stderr\": 0.03216600808802269,\n \"acc_norm\": 0.8194444444444444,\n\
\ \"acc_norm_stderr\": 0.03216600808802269\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.7052023121387283,\n\
\ \"acc_stderr\": 0.034765996075164785,\n \"acc_norm\": 0.7052023121387283,\n\
\ \"acc_norm_stderr\": 0.034765996075164785\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287533,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287533\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6425531914893617,\n \"acc_stderr\": 0.031329417894764254,\n\
\ \"acc_norm\": 0.6425531914893617,\n \"acc_norm_stderr\": 0.031329417894764254\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5526315789473685,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.5526315789473685,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6482758620689655,\n \"acc_stderr\": 0.0397923663749741,\n\
\ \"acc_norm\": 0.6482758620689655,\n \"acc_norm_stderr\": 0.0397923663749741\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5105820105820106,\n \"acc_stderr\": 0.02574554227604548,\n \"\
acc_norm\": 0.5105820105820106,\n \"acc_norm_stderr\": 0.02574554227604548\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8258064516129032,\n\
\ \"acc_stderr\": 0.021576248184514583,\n \"acc_norm\": 0.8258064516129032,\n\
\ \"acc_norm_stderr\": 0.021576248184514583\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6157635467980296,\n \"acc_stderr\": 0.0342239856565755,\n\
\ \"acc_norm\": 0.6157635467980296,\n \"acc_norm_stderr\": 0.0342239856565755\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\"\
: 0.76,\n \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8181818181818182,\n \"acc_stderr\": 0.030117688929503564,\n\
\ \"acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.030117688929503564\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8535353535353535,\n \"acc_stderr\": 0.025190921114603915,\n \"\
acc_norm\": 0.8535353535353535,\n \"acc_norm_stderr\": 0.025190921114603915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8756476683937824,\n \"acc_stderr\": 0.023814477086593542,\n\
\ \"acc_norm\": 0.8756476683937824,\n \"acc_norm_stderr\": 0.023814477086593542\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6923076923076923,\n \"acc_stderr\": 0.023400928918310485,\n\
\ \"acc_norm\": 0.6923076923076923,\n \"acc_norm_stderr\": 0.023400928918310485\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.02904560029061626,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.02904560029061626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.773109243697479,\n \"acc_stderr\": 0.02720537153827948,\n \
\ \"acc_norm\": 0.773109243697479,\n \"acc_norm_stderr\": 0.02720537153827948\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.48344370860927155,\n \"acc_stderr\": 0.0408024418562897,\n \"\
acc_norm\": 0.48344370860927155,\n \"acc_norm_stderr\": 0.0408024418562897\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8715596330275229,\n \"acc_stderr\": 0.014344977542914318,\n \"\
acc_norm\": 0.8715596330275229,\n \"acc_norm_stderr\": 0.014344977542914318\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5648148148148148,\n \"acc_stderr\": 0.033812000056435254,\n \"\
acc_norm\": 0.5648148148148148,\n \"acc_norm_stderr\": 0.033812000056435254\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8578431372549019,\n \"acc_stderr\": 0.024509803921568627,\n \"\
acc_norm\": 0.8578431372549019,\n \"acc_norm_stderr\": 0.024509803921568627\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632446,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632446\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7623318385650224,\n\
\ \"acc_stderr\": 0.028568079464714274,\n \"acc_norm\": 0.7623318385650224,\n\
\ \"acc_norm_stderr\": 0.028568079464714274\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6641221374045801,\n \"acc_stderr\": 0.041423137719966634,\n\
\ \"acc_norm\": 0.6641221374045801,\n \"acc_norm_stderr\": 0.041423137719966634\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026622,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026622\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8931623931623932,\n\
\ \"acc_stderr\": 0.02023714900899093,\n \"acc_norm\": 0.8931623931623932,\n\
\ \"acc_norm_stderr\": 0.02023714900899093\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579825,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579825\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4558659217877095,\n\
\ \"acc_stderr\": 0.016657229424586306,\n \"acc_norm\": 0.4558659217877095,\n\
\ \"acc_norm_stderr\": 0.016657229424586306\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.02380518652488814,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02380518652488814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7459807073954984,\n\
\ \"acc_stderr\": 0.0247238615047717,\n \"acc_norm\": 0.7459807073954984,\n\
\ \"acc_norm_stderr\": 0.0247238615047717\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.023132376234543346,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.023132376234543346\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5354609929078015,\n \"acc_stderr\": 0.02975238965742705,\n \
\ \"acc_norm\": 0.5354609929078015,\n \"acc_norm_stderr\": 0.02975238965742705\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4973924380704042,\n\
\ \"acc_stderr\": 0.012770062445433172,\n \"acc_norm\": 0.4973924380704042,\n\
\ \"acc_norm_stderr\": 0.012770062445433172\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274053,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274053\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.01877168389352817,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.01877168389352817\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.763265306122449,\n \"acc_stderr\": 0.02721283588407316,\n\
\ \"acc_norm\": 0.763265306122449,\n \"acc_norm_stderr\": 0.02721283588407316\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.025196929874827072,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.025196929874827072\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5691554467564259,\n\
\ \"mc1_stderr\": 0.01733527247533237,\n \"mc2\": 0.710829343782068,\n\
\ \"mc2_stderr\": 0.014802276642222825\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8721389108129439,\n \"acc_stderr\": 0.009385235583937262\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5648218347232752,\n \
\ \"acc_stderr\": 0.013656253875470738\n }\n}\n```"
repo_url: https://huggingface.co/4season/alignment-model-test10
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|arc:challenge|25_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|arc:challenge|25_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|gsm8k|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|gsm8k|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hellaswag|10_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hellaswag|10_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-09.669210.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-37.264622.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T08-12-37.264622.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- '**/details_harness|winogrande|5_2024-04-09T08-12-09.669210.parquet'
- split: 2024_04_09T08_12_37.264622
path:
- '**/details_harness|winogrande|5_2024-04-09T08-12-37.264622.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T08-12-37.264622.parquet'
- config_name: results
data_files:
- split: 2024_04_09T08_12_09.669210
path:
- results_2024-04-09T08-12-09.669210.parquet
- split: 2024_04_09T08_12_37.264622
path:
- results_2024-04-09T08-12-37.264622.parquet
- split: latest
path:
- results_2024-04-09T08-12-37.264622.parquet
---
# Dataset Card for Evaluation run of 4season/alignment-model-test10
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [4season/alignment-model-test10](https://huggingface.co/4season/alignment-model-test10) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_4season__alignment-model-test10",
"harness_winogrande_5",
split="train")
```
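Each configuration also exposes a `latest` split, and the aggregated scores live in the `results` configuration (see the YAML above). A minimal sketch, assuming the standard `datasets` API and the configuration names listed in this card, that loads the most recent per-task details and the aggregated metrics:
```python
from datasets import load_dataset

# Per-task details for the most recent run (the "latest" split defined above).
details = load_dataset(
    "open-llm-leaderboard/details_4season__alignment-model-test10",
    "harness_gsm8k_5",
    split="latest",
)

# Aggregated metrics for the same run, stored in the "results" configuration.
results = load_dataset(
    "open-llm-leaderboard/details_4season__alignment-model-test10",
    "results",
    split="latest",
)
print(results[0])
```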
## Latest results
These are the [latest results from run 2024-04-09T08:12:37.264622](https://huggingface.co/datasets/open-llm-leaderboard/details_4season__alignment-model-test10/blob/main/results_2024-04-09T08-12-37.264622.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6830251707414831,
"acc_stderr": 0.03150837150158549,
"acc_norm": 0.6842989605978566,
"acc_norm_stderr": 0.032158515186000075,
"mc1": 0.5691554467564259,
"mc1_stderr": 0.01733527247533237,
"mc2": 0.710829343782068,
"mc2_stderr": 0.014802276642222825
},
"harness|arc:challenge|25": {
"acc": 0.7738907849829352,
"acc_stderr": 0.012224202097063276,
"acc_norm": 0.7960750853242321,
"acc_norm_stderr": 0.01177426247870226
},
"harness|hellaswag|10": {
"acc": 0.7762397928699463,
"acc_stderr": 0.004159114679873824,
"acc_norm": 0.9001194981079467,
"acc_norm_stderr": 0.002992278134932447
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.033911609343436025,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.033911609343436025
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724067,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724067
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8194444444444444,
"acc_stderr": 0.03216600808802269,
"acc_norm": 0.8194444444444444,
"acc_norm_stderr": 0.03216600808802269
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.7052023121387283,
"acc_stderr": 0.034765996075164785,
"acc_norm": 0.7052023121387283,
"acc_norm_stderr": 0.034765996075164785
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287533,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287533
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6425531914893617,
"acc_stderr": 0.031329417894764254,
"acc_norm": 0.6425531914893617,
"acc_norm_stderr": 0.031329417894764254
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6482758620689655,
"acc_stderr": 0.0397923663749741,
"acc_norm": 0.6482758620689655,
"acc_norm_stderr": 0.0397923663749741
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5105820105820106,
"acc_stderr": 0.02574554227604548,
"acc_norm": 0.5105820105820106,
"acc_norm_stderr": 0.02574554227604548
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8258064516129032,
"acc_stderr": 0.021576248184514583,
"acc_norm": 0.8258064516129032,
"acc_norm_stderr": 0.021576248184514583
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6157635467980296,
"acc_stderr": 0.0342239856565755,
"acc_norm": 0.6157635467980296,
"acc_norm_stderr": 0.0342239856565755
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.030117688929503564,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.030117688929503564
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8535353535353535,
"acc_stderr": 0.025190921114603915,
"acc_norm": 0.8535353535353535,
"acc_norm_stderr": 0.025190921114603915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8756476683937824,
"acc_stderr": 0.023814477086593542,
"acc_norm": 0.8756476683937824,
"acc_norm_stderr": 0.023814477086593542
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6923076923076923,
"acc_stderr": 0.023400928918310485,
"acc_norm": 0.6923076923076923,
"acc_norm_stderr": 0.023400928918310485
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.02904560029061626,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.02904560029061626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.773109243697479,
"acc_stderr": 0.02720537153827948,
"acc_norm": 0.773109243697479,
"acc_norm_stderr": 0.02720537153827948
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.48344370860927155,
"acc_stderr": 0.0408024418562897,
"acc_norm": 0.48344370860927155,
"acc_norm_stderr": 0.0408024418562897
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8715596330275229,
"acc_stderr": 0.014344977542914318,
"acc_norm": 0.8715596330275229,
"acc_norm_stderr": 0.014344977542914318
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5648148148148148,
"acc_stderr": 0.033812000056435254,
"acc_norm": 0.5648148148148148,
"acc_norm_stderr": 0.033812000056435254
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8578431372549019,
"acc_stderr": 0.024509803921568627,
"acc_norm": 0.8578431372549019,
"acc_norm_stderr": 0.024509803921568627
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632446,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632446
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7623318385650224,
"acc_stderr": 0.028568079464714274,
"acc_norm": 0.7623318385650224,
"acc_norm_stderr": 0.028568079464714274
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6641221374045801,
"acc_stderr": 0.041423137719966634,
"acc_norm": 0.6641221374045801,
"acc_norm_stderr": 0.041423137719966634
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026622,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026622
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8931623931623932,
"acc_stderr": 0.02023714900899093,
"acc_norm": 0.8931623931623932,
"acc_norm_stderr": 0.02023714900899093
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579825,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579825
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4558659217877095,
"acc_stderr": 0.016657229424586306,
"acc_norm": 0.4558659217877095,
"acc_norm_stderr": 0.016657229424586306
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02380518652488814,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02380518652488814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7459807073954984,
"acc_stderr": 0.0247238615047717,
"acc_norm": 0.7459807073954984,
"acc_norm_stderr": 0.0247238615047717
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.023132376234543346,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.023132376234543346
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5354609929078015,
"acc_stderr": 0.02975238965742705,
"acc_norm": 0.5354609929078015,
"acc_norm_stderr": 0.02975238965742705
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4973924380704042,
"acc_stderr": 0.012770062445433172,
"acc_norm": 0.4973924380704042,
"acc_norm_stderr": 0.012770062445433172
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.01877168389352817,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.01877168389352817
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.763265306122449,
"acc_stderr": 0.02721283588407316,
"acc_norm": 0.763265306122449,
"acc_norm_stderr": 0.02721283588407316
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.025196929874827072,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.025196929874827072
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5691554467564259,
"mc1_stderr": 0.01733527247533237,
"mc2": 0.710829343782068,
"mc2_stderr": 0.014802276642222825
},
"harness|winogrande|5": {
"acc": 0.8721389108129439,
"acc_stderr": 0.009385235583937262
},
"harness|gsm8k|5": {
"acc": 0.5648218347232752,
"acc_stderr": 0.013656253875470738
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_raincandy-u__Quark-464M-v0.2 | open-llm-leaderboard-old | "2024-04-09T08:41:40Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T08:41:05Z" | ---
pretty_name: Evaluation run of raincandy-u/Quark-464M-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [raincandy-u/Quark-464M-v0.2](https://huggingface.co/raincandy-u/Quark-464M-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_raincandy-u__Quark-464M-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T08:39:02.304022](https://huggingface.co/datasets/open-llm-leaderboard/details_raincandy-u__Quark-464M-v0.2/blob/main/results_2024-04-09T08-39-02.304022.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3125584120923437,\n\
\ \"acc_stderr\": 0.032810288522301126,\n \"acc_norm\": 0.31507177497570615,\n\
\ \"acc_norm_stderr\": 0.03360115960632853,\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4388966108457062,\n\
\ \"mc2_stderr\": 0.015289903733660282\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.26621160409556316,\n \"acc_stderr\": 0.012915774781523202,\n\
\ \"acc_norm\": 0.3046075085324232,\n \"acc_norm_stderr\": 0.013449522109932487\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.36367257518422624,\n\
\ \"acc_stderr\": 0.004800728138792369,\n \"acc_norm\": 0.44961163114917346,\n\
\ \"acc_norm_stderr\": 0.004964378762425237\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.35555555555555557,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.35555555555555557,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.26973684210526316,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.26973684210526316,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3018867924528302,\n \"acc_stderr\": 0.028254200344438665,\n\
\ \"acc_norm\": 0.3018867924528302,\n \"acc_norm_stderr\": 0.028254200344438665\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.033687629322594295,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.033687629322594295\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.28936170212765955,\n \"acc_stderr\": 0.029644006577009618,\n\
\ \"acc_norm\": 0.28936170212765955,\n \"acc_norm_stderr\": 0.029644006577009618\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436695,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436695\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.3310344827586207,\n \"acc_stderr\": 0.03921545312467122,\n\
\ \"acc_norm\": 0.3310344827586207,\n \"acc_norm_stderr\": 0.03921545312467122\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24074074074074073,\n \"acc_stderr\": 0.022019080012217897,\n \"\
acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.022019080012217897\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.3419354838709677,\n\
\ \"acc_stderr\": 0.026985289576552735,\n \"acc_norm\": 0.3419354838709677,\n\
\ \"acc_norm_stderr\": 0.026985289576552735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3054187192118227,\n \"acc_stderr\": 0.03240661565868408,\n\
\ \"acc_norm\": 0.3054187192118227,\n \"acc_norm_stderr\": 0.03240661565868408\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \"acc_norm\"\
: 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.3151515151515151,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.3151515151515151,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3181818181818182,\n \"acc_stderr\": 0.033184773338453315,\n \"\
acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.033184773338453315\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.32124352331606215,\n \"acc_stderr\": 0.033699508685490674,\n\
\ \"acc_norm\": 0.32124352331606215,\n \"acc_norm_stderr\": 0.033699508685490674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2717948717948718,\n \"acc_stderr\": 0.022556551010132368,\n\
\ \"acc_norm\": 0.2717948717948718,\n \"acc_norm_stderr\": 0.022556551010132368\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.23703703703703705,\n \"acc_stderr\": 0.025928876132766097,\n \
\ \"acc_norm\": 0.23703703703703705,\n \"acc_norm_stderr\": 0.025928876132766097\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886838,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886838\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2251655629139073,\n \"acc_stderr\": 0.03410435282008936,\n \"\
acc_norm\": 0.2251655629139073,\n \"acc_norm_stderr\": 0.03410435282008936\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.3743119266055046,\n \"acc_stderr\": 0.02074895940898832,\n \"\
acc_norm\": 0.3743119266055046,\n \"acc_norm_stderr\": 0.02074895940898832\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257013,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257013\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.3088235294117647,\n \"acc_stderr\": 0.03242661719827218,\n \"\
acc_norm\": 0.3088235294117647,\n \"acc_norm_stderr\": 0.03242661719827218\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.34177215189873417,\n \"acc_stderr\": 0.030874537537553617,\n \
\ \"acc_norm\": 0.34177215189873417,\n \"acc_norm_stderr\": 0.030874537537553617\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.43946188340807174,\n\
\ \"acc_stderr\": 0.03331092511038179,\n \"acc_norm\": 0.43946188340807174,\n\
\ \"acc_norm_stderr\": 0.03331092511038179\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.37404580152671757,\n \"acc_stderr\": 0.04243869242230523,\n\
\ \"acc_norm\": 0.37404580152671757,\n \"acc_norm_stderr\": 0.04243869242230523\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.4297520661157025,\n \"acc_stderr\": 0.04519082021319773,\n \"\
acc_norm\": 0.4297520661157025,\n \"acc_norm_stderr\": 0.04519082021319773\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.0471282125742677,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.0471282125742677\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.32515337423312884,\n \"acc_stderr\": 0.03680350371286461,\n\
\ \"acc_norm\": 0.32515337423312884,\n \"acc_norm_stderr\": 0.03680350371286461\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.045416094465039476,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.045416094465039476\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4188034188034188,\n\
\ \"acc_stderr\": 0.03232128912157792,\n \"acc_norm\": 0.4188034188034188,\n\
\ \"acc_norm_stderr\": 0.03232128912157792\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.36015325670498083,\n\
\ \"acc_stderr\": 0.017166362471369306,\n \"acc_norm\": 0.36015325670498083,\n\
\ \"acc_norm_stderr\": 0.017166362471369306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.34971098265895956,\n \"acc_stderr\": 0.025674281456531025,\n\
\ \"acc_norm\": 0.34971098265895956,\n \"acc_norm_stderr\": 0.025674281456531025\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23910614525139665,\n\
\ \"acc_stderr\": 0.014265554192331144,\n \"acc_norm\": 0.23910614525139665,\n\
\ \"acc_norm_stderr\": 0.014265554192331144\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.026568921015457155,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.026568921015457155\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3279742765273312,\n\
\ \"acc_stderr\": 0.02666441088693761,\n \"acc_norm\": 0.3279742765273312,\n\
\ \"acc_norm_stderr\": 0.02666441088693761\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.33641975308641975,\n \"acc_stderr\": 0.026289734945952922,\n\
\ \"acc_norm\": 0.33641975308641975,\n \"acc_norm_stderr\": 0.026289734945952922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2730496453900709,\n \"acc_stderr\": 0.026577860943307854,\n \
\ \"acc_norm\": 0.2730496453900709,\n \"acc_norm_stderr\": 0.026577860943307854\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2920469361147327,\n\
\ \"acc_stderr\": 0.011613349136271824,\n \"acc_norm\": 0.2920469361147327,\n\
\ \"acc_norm_stderr\": 0.011613349136271824\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2977941176470588,\n \"acc_stderr\": 0.02777829870154544,\n\
\ \"acc_norm\": 0.2977941176470588,\n \"acc_norm_stderr\": 0.02777829870154544\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3137254901960784,\n \"acc_stderr\": 0.018771683893528183,\n \
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.018771683893528183\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2612244897959184,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.2612244897959184,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.39303482587064675,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.39303482587064675,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.27485380116959063,\n \"acc_stderr\": 0.034240429246915824,\n\
\ \"acc_norm\": 0.27485380116959063,\n \"acc_norm_stderr\": 0.034240429246915824\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2827417380660955,\n\
\ \"mc1_stderr\": 0.015764770836777308,\n \"mc2\": 0.4388966108457062,\n\
\ \"mc2_stderr\": 0.015289903733660282\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5564325177584846,\n \"acc_stderr\": 0.013962694907620404\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.04473085670962851,\n \
\ \"acc_stderr\": 0.005693886131407054\n }\n}\n```"
repo_url: https://huggingface.co/raincandy-u/Quark-464M-v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|arc:challenge|25_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|gsm8k|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hellaswag|10_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-39-02.304022.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T08-39-02.304022.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- '**/details_harness|winogrande|5_2024-04-09T08-39-02.304022.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T08-39-02.304022.parquet'
- config_name: results
data_files:
- split: 2024_04_09T08_39_02.304022
path:
- results_2024-04-09T08-39-02.304022.parquet
- split: latest
path:
- results_2024-04-09T08-39-02.304022.parquet
---
# Dataset Card for Evaluation run of raincandy-u/Quark-464M-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [raincandy-u/Quark-464M-v0.2](https://huggingface.co/raincandy-u/Quark-464M-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_raincandy-u__Quark-464M-v0.2",
"harness_winogrande_5",
split="train")
```
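The same pattern works for every configuration declared in the YAML header above. As a minimal sketch (using only the config and split names listed in this card, and the standard `datasets` API), you can also load the per-task details pinned to the most recent run via the "latest" split, or the aggregated "results" configuration:
```python
from datasets import load_dataset

REPO = "open-llm-leaderboard/details_raincandy-u__Quark-464M-v0.2"

# Per-task details for one MMLU subject, pinned to the most recent evaluation run
details = load_dataset(REPO, "harness_hendrycksTest_abstract_algebra_5", split="latest")

# Aggregated metrics for the whole run (the "results" configuration)
results = load_dataset(REPO, "results", split="latest")
print(results[0])
```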
## Latest results
These are the [latest results from run 2024-04-09T08:39:02.304022](https://huggingface.co/datasets/open-llm-leaderboard/details_raincandy-u__Quark-464M-v0.2/blob/main/results_2024-04-09T08-39-02.304022.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3125584120923437,
"acc_stderr": 0.032810288522301126,
"acc_norm": 0.31507177497570615,
"acc_norm_stderr": 0.03360115960632853,
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.4388966108457062,
"mc2_stderr": 0.015289903733660282
},
"harness|arc:challenge|25": {
"acc": 0.26621160409556316,
"acc_stderr": 0.012915774781523202,
"acc_norm": 0.3046075085324232,
"acc_norm_stderr": 0.013449522109932487
},
"harness|hellaswag|10": {
"acc": 0.36367257518422624,
"acc_stderr": 0.004800728138792369,
"acc_norm": 0.44961163114917346,
"acc_norm_stderr": 0.004964378762425237
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.35555555555555557,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.35555555555555557,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.26973684210526316,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.26973684210526316,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3018867924528302,
"acc_stderr": 0.028254200344438665,
"acc_norm": 0.3018867924528302,
"acc_norm_stderr": 0.028254200344438665
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.033687629322594295,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.033687629322594295
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.28936170212765955,
"acc_stderr": 0.029644006577009618,
"acc_norm": 0.28936170212765955,
"acc_norm_stderr": 0.029644006577009618
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436695,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436695
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.3310344827586207,
"acc_stderr": 0.03921545312467122,
"acc_norm": 0.3310344827586207,
"acc_norm_stderr": 0.03921545312467122
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.022019080012217897,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.022019080012217897
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3419354838709677,
"acc_stderr": 0.026985289576552735,
"acc_norm": 0.3419354838709677,
"acc_norm_stderr": 0.026985289576552735
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3054187192118227,
"acc_stderr": 0.03240661565868408,
"acc_norm": 0.3054187192118227,
"acc_norm_stderr": 0.03240661565868408
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.3151515151515151,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.3151515151515151,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.033184773338453315,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.033184773338453315
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.32124352331606215,
"acc_stderr": 0.033699508685490674,
"acc_norm": 0.32124352331606215,
"acc_norm_stderr": 0.033699508685490674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2717948717948718,
"acc_stderr": 0.022556551010132368,
"acc_norm": 0.2717948717948718,
"acc_norm_stderr": 0.022556551010132368
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.025928876132766097,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.025928876132766097
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886838,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886838
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2251655629139073,
"acc_stderr": 0.03410435282008936,
"acc_norm": 0.2251655629139073,
"acc_norm_stderr": 0.03410435282008936
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.3743119266055046,
"acc_stderr": 0.02074895940898832,
"acc_norm": 0.3743119266055046,
"acc_norm_stderr": 0.02074895940898832
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257013,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257013
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.3088235294117647,
"acc_stderr": 0.03242661719827218,
"acc_norm": 0.3088235294117647,
"acc_norm_stderr": 0.03242661719827218
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.34177215189873417,
"acc_stderr": 0.030874537537553617,
"acc_norm": 0.34177215189873417,
"acc_norm_stderr": 0.030874537537553617
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.43946188340807174,
"acc_stderr": 0.03331092511038179,
"acc_norm": 0.43946188340807174,
"acc_norm_stderr": 0.03331092511038179
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.37404580152671757,
"acc_stderr": 0.04243869242230523,
"acc_norm": 0.37404580152671757,
"acc_norm_stderr": 0.04243869242230523
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.4297520661157025,
"acc_stderr": 0.04519082021319773,
"acc_norm": 0.4297520661157025,
"acc_norm_stderr": 0.04519082021319773
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.0471282125742677,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.0471282125742677
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.32515337423312884,
"acc_stderr": 0.03680350371286461,
"acc_norm": 0.32515337423312884,
"acc_norm_stderr": 0.03680350371286461
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.045416094465039476,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.045416094465039476
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4188034188034188,
"acc_stderr": 0.03232128912157792,
"acc_norm": 0.4188034188034188,
"acc_norm_stderr": 0.03232128912157792
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.36015325670498083,
"acc_stderr": 0.017166362471369306,
"acc_norm": 0.36015325670498083,
"acc_norm_stderr": 0.017166362471369306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.34971098265895956,
"acc_stderr": 0.025674281456531025,
"acc_norm": 0.34971098265895956,
"acc_norm_stderr": 0.025674281456531025
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23910614525139665,
"acc_stderr": 0.014265554192331144,
"acc_norm": 0.23910614525139665,
"acc_norm_stderr": 0.014265554192331144
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.026568921015457155,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.026568921015457155
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3279742765273312,
"acc_stderr": 0.02666441088693761,
"acc_norm": 0.3279742765273312,
"acc_norm_stderr": 0.02666441088693761
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.33641975308641975,
"acc_stderr": 0.026289734945952922,
"acc_norm": 0.33641975308641975,
"acc_norm_stderr": 0.026289734945952922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2730496453900709,
"acc_stderr": 0.026577860943307854,
"acc_norm": 0.2730496453900709,
"acc_norm_stderr": 0.026577860943307854
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2920469361147327,
"acc_stderr": 0.011613349136271824,
"acc_norm": 0.2920469361147327,
"acc_norm_stderr": 0.011613349136271824
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2977941176470588,
"acc_stderr": 0.02777829870154544,
"acc_norm": 0.2977941176470588,
"acc_norm_stderr": 0.02777829870154544
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.018771683893528183,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.018771683893528183
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2612244897959184,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.2612244897959184,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.39303482587064675,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.39303482587064675,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.27485380116959063,
"acc_stderr": 0.034240429246915824,
"acc_norm": 0.27485380116959063,
"acc_norm_stderr": 0.034240429246915824
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2827417380660955,
"mc1_stderr": 0.015764770836777308,
"mc2": 0.4388966108457062,
"mc2_stderr": 0.015289903733660282
},
"harness|winogrande|5": {
"acc": 0.5564325177584846,
"acc_stderr": 0.013962694907620404
},
"harness|gsm8k|5": {
"acc": 0.04473085670962851,
"acc_stderr": 0.005693886131407054
}
}
```
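The aggregated results above are a flat JSON object mapping each task name to its metric values. As a rough illustration (not part of the original evaluation output), the sketch below averages the `acc_norm` scores of the `hendrycksTest` (MMLU) sub-tasks, assuming the JSON has been saved locally as `results.json` (a hypothetical filename):
```python
# Minimal sketch: average the MMLU sub-task accuracies from the aggregated
# results JSON shown above. "results.json" is assumed to be a local copy of
# that JSON; it is not a file shipped with this dataset.
import json

with open("results.json") as f:
    results = json.load(f)

mmlu = [
    scores["acc_norm"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
print(f"Mean acc_norm over {len(mmlu)} MMLU sub-tasks: {sum(mmlu) / len(mmlu):.4f}")
```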
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SJTU-TES/Fake-Detect | SJTU-TES | "2024-04-09T08:59:00Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:imagefolder",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T08:58:14Z" | ---
license: apache-2.0
---
|
anti-ai/Vietnamese_MsMacro | anti-ai | "2024-04-28T03:32:58Z" | 0 | 0 | [
"task_categories:text-retrieval",
"task_categories:text-classification",
"task_categories:sentence-similarity",
"language:vi",
"license:apache-2.0",
"size_categories:10M<n<100M",
"arxiv:2108.13897",
"region:us"
] | [
"text-retrieval",
"text-classification",
"sentence-similarity"
] | "2024-04-09T09:02:40Z" | ---
license: apache-2.0
task_categories:
- text-retrieval
- text-classification
- sentence-similarity
language:
- vi
size_categories:
- 10M<n<100M
extra_gated_fields:
Name: text
Company/Organization: text
E-Mail: text
configs:
- config_name: data
---
### Licensing Information
This dataset is released under [Apache license 2.0](https://www.apache.org/licenses/).
### Citation Information
```
@article{DBLP:journals/corr/abs-2108-13897,
author = {Luiz Bonifacio and
Israel Campiotti and
Roberto de Alencar Lotufo and
Rodrigo Frassetto Nogueira},
title = {mMARCO: {A} Multilingual Version of {MS} {MARCO} Passage Ranking Dataset},
journal = {CoRR},
volume = {abs/2108.13897},
year = {2021},
url = {https://arxiv.org/abs/2108.13897},
eprinttype = {arXiv},
eprint = {2108.13897},
timestamp = {Mon, 20 Mar 2023 15:35:34 +0100},
biburl = {https://dblp.org/rec/journals/corr/abs-2108-13897.bib},
bibsource = {dblp computer science bibliography, https://dblp.org}
}
``` |
wltjr1007/cifar100_clip | wltjr1007 | "2024-04-09T09:13:26Z" | 0 | 0 | [
"task_categories:image-classification",
"annotations_creators:crowdsourced",
"language_creators:found",
"multilinguality:monolingual",
"source_datasets:extended|other-80-Million-Tiny-Images",
"language:en",
"license:unknown",
"size_categories:10K<n<100K",
"region:us"
] | [
"image-classification"
] | "2024-04-09T09:11:11Z" | ---
annotations_creators:
- crowdsourced
language_creators:
- found
language:
- en
license:
- unknown
multilinguality:
- monolingual
size_categories:
- 10K<n<100K
source_datasets:
- extended|other-80-Million-Tiny-Images
task_categories:
- image-classification
task_ids: []
paperswithcode_id: cifar-100
pretty_name: Cifar100
dataset_info:
config_name: cifar100
features:
- name: img
dtype: image
- name: fine_label
dtype:
class_label:
names:
'0': apple
'1': aquarium_fish
'2': baby
'3': bear
'4': beaver
'5': bed
'6': bee
'7': beetle
'8': bicycle
'9': bottle
'10': bowl
'11': boy
'12': bridge
'13': bus
'14': butterfly
'15': camel
'16': can
'17': castle
'18': caterpillar
'19': cattle
'20': chair
'21': chimpanzee
'22': clock
'23': cloud
'24': cockroach
'25': couch
'26': cra
'27': crocodile
'28': cup
'29': dinosaur
'30': dolphin
'31': elephant
'32': flatfish
'33': forest
'34': fox
'35': girl
'36': hamster
'37': house
'38': kangaroo
'39': keyboard
'40': lamp
'41': lawn_mower
'42': leopard
'43': lion
'44': lizard
'45': lobster
'46': man
'47': maple_tree
'48': motorcycle
'49': mountain
'50': mouse
'51': mushroom
'52': oak_tree
'53': orange
'54': orchid
'55': otter
'56': palm_tree
'57': pear
'58': pickup_truck
'59': pine_tree
'60': plain
'61': plate
'62': poppy
'63': porcupine
'64': possum
'65': rabbit
'66': raccoon
'67': ray
'68': road
'69': rocket
'70': rose
'71': sea
'72': seal
'73': shark
'74': shrew
'75': skunk
'76': skyscraper
'77': snail
'78': snake
'79': spider
'80': squirrel
'81': streetcar
'82': sunflower
'83': sweet_pepper
'84': table
'85': tank
'86': telephone
'87': television
'88': tiger
'89': tractor
'90': train
'91': trout
'92': tulip
'93': turtle
'94': wardrobe
'95': whale
'96': willow_tree
'97': wolf
'98': woman
'99': worm
- name: coarse_label
dtype:
class_label:
names:
'0': aquatic_mammals
'1': fish
'2': flowers
'3': food_containers
'4': fruit_and_vegetables
'5': household_electrical_devices
'6': household_furniture
'7': insects
'8': large_carnivores
'9': large_man-made_outdoor_things
'10': large_natural_outdoor_scenes
'11': large_omnivores_and_herbivores
'12': medium_mammals
'13': non-insect_invertebrates
'14': people
'15': reptiles
'16': small_mammals
'17': trees
'18': vehicles_1
'19': vehicles_2
--- |
Danielfu17/chunked_guanzhangtone | Danielfu17 | "2024-06-08T08:23:31Z" | 0 | 0 | [
"license:unknown",
"region:us"
] | null | "2024-04-09T09:16:16Z" | ---
license: unknown
---
|
MinhMinh09/dictionary-20240409 | MinhMinh09 | "2024-04-11T06:56:29Z" | 0 | 0 | [
"language:vi",
"language:en",
"license:mit",
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T09:32:16Z" | ---
language:
- vi
- en
license: mit
---
|
qiushanhku/discipline | qiushanhku | "2024-04-09T10:12:42Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-09T09:38:53Z" | ---
license: apache-2.0
---
|
upendrawappgo/guanaco-llama2-1k | upendrawappgo | "2024-04-09T10:18:39Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T10:18:35Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1653068
num_examples: 1000
download_size: 966647
dataset_size: 1653068
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_MaziyarPanahi__Experiment26Yam_Ognoexperiment27Multi_verse_model | open-llm-leaderboard-old | "2024-04-09T10:30:22Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T10:30:01Z" | ---
pretty_name: Evaluation run of MaziyarPanahi/Experiment26Yam_Ognoexperiment27Multi_verse_model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Experiment26Yam_Ognoexperiment27Multi_verse_model](https://huggingface.co/MaziyarPanahi/Experiment26Yam_Ognoexperiment27Multi_verse_model)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yam_Ognoexperiment27Multi_verse_model\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T10:27:40.349773](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yam_Ognoexperiment27Multi_verse_model/blob/main/results_2024-04-09T10-27-40.349773.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6501353349169147,\n\
\ \"acc_stderr\": 0.03210199365277827,\n \"acc_norm\": 0.6491186479274945,\n\
\ \"acc_norm_stderr\": 0.03277884429890129,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7811688544577092,\n\
\ \"mc2_stderr\": 0.013668825281350112\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7133105802047781,\n \"acc_stderr\": 0.013214986329274776,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7187811192989444,\n\
\ \"acc_stderr\": 0.0044867522004303495,\n \"acc_norm\": 0.8922525393347939,\n\
\ \"acc_norm_stderr\": 0.003094275186361527\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.032400380867927465,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.032400380867927465\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723292,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723292\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4312849162011173,\n\
\ \"acc_stderr\": 0.016563829399047703,\n \"acc_norm\": 0.4312849162011173,\n\
\ \"acc_norm_stderr\": 0.016563829399047703\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523369,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523369\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142783,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142783\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233264,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233264\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7811688544577092,\n\
\ \"mc2_stderr\": 0.013668825281350112\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627295\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6982562547384382,\n \
\ \"acc_stderr\": 0.012643544762873354\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Experiment26Yam_Ognoexperiment27Multi_verse_model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-40.349773.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-27-40.349773.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- '**/details_harness|winogrande|5_2024-04-09T10-27-40.349773.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T10-27-40.349773.parquet'
- config_name: results
data_files:
- split: 2024_04_09T10_27_40.349773
path:
- results_2024-04-09T10-27-40.349773.parquet
- split: latest
path:
- results_2024-04-09T10-27-40.349773.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Experiment26Yam_Ognoexperiment27Multi_verse_model
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Experiment26Yam_Ognoexperiment27Multi_verse_model](https://huggingface.co/MaziyarPanahi/Experiment26Yam_Ognoexperiment27Multi_verse_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yam_Ognoexperiment27Multi_verse_model",
"harness_winogrande_5",
split="train")
```
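The aggregated metrics can be loaded the same way through the "results" configuration. A minimal sketch (assuming the "latest" split declared in the configs above, which points at the most recent results file):
```python
from datasets import load_dataset

# Aggregated metrics for this model; the "latest" split tracks the most recent run.
results = load_dataset(
    "open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yam_Ognoexperiment27Multi_verse_model",
    "results",
    split="latest",
)
print(results[0])  # one row per run, roughly mirroring the JSON summary shown below
```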
## Latest results
These are the [latest results from run 2024-04-09T10:27:40.349773](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yam_Ognoexperiment27Multi_verse_model/blob/main/results_2024-04-09T10-27-40.349773.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each of them in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6501353349169147,
"acc_stderr": 0.03210199365277827,
"acc_norm": 0.6491186479274945,
"acc_norm_stderr": 0.03277884429890129,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7811688544577092,
"mc2_stderr": 0.013668825281350112
},
"harness|arc:challenge|25": {
"acc": 0.7133105802047781,
"acc_stderr": 0.013214986329274776,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7187811192989444,
"acc_stderr": 0.0044867522004303495,
"acc_norm": 0.8922525393347939,
"acc_norm_stderr": 0.003094275186361527
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.032400380867927465,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.032400380867927465
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723292,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723292
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4312849162011173,
"acc_stderr": 0.016563829399047703,
"acc_norm": 0.4312849162011173,
"acc_norm_stderr": 0.016563829399047703
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523369,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523369
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142783,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142783
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233264,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233264
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7811688544577092,
"mc2_stderr": 0.013668825281350112
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627295
},
"harness|gsm8k|5": {
"acc": 0.6982562547384382,
"acc_stderr": 0.012643544762873354
}
}
```
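As an illustration of how the per-task entries above combine, the MMLU ("hendrycksTest") scores can be averaged into a single number with a few lines of Python. This is only a sketch: it assumes the JSON above has been saved locally as `results.json` (a hypothetical filename), and it uses an unweighted mean, which may differ slightly from the leaderboard's own aggregation.
```python
import json

# Hypothetical local copy of the JSON summary shown above.
with open("results.json") as f:
    metrics = json.load(f)

# Unweighted mean of acc_norm over the MMLU ("hendrycksTest") subtasks.
mmlu = {k: v for k, v in metrics.items() if k.startswith("harness|hendrycksTest-")}
mmlu_avg = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(f"{len(mmlu)} MMLU subtasks, mean acc_norm = {mmlu_avg:.4f}")
```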
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q | open-llm-leaderboard-old | "2024-04-09T10:30:24Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T10:30:03Z" | ---
pretty_name: Evaluation run of MaziyarPanahi/YamshadowInex12_Experiment26T3q
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/YamshadowInex12_Experiment26T3q](https://huggingface.co/MaziyarPanahi/YamshadowInex12_Experiment26T3q)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T10:27:46.019757](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q/blob/main/results_2024-04-09T10-27-46.019757.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6510455186561941,\n\
\ \"acc_stderr\": 0.032057149961613414,\n \"acc_norm\": 0.6501342340668828,\n\
\ \"acc_norm_stderr\": 0.03273135985814274,\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720134,\n \"mc2\": 0.7835453305304184,\n\
\ \"mc2_stderr\": 0.01361341647369438\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7158703071672355,\n \"acc_stderr\": 0.013179442447653886,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7175861382194781,\n\
\ \"acc_stderr\": 0.004492535748097627,\n \"acc_norm\": 0.8925512846046604,\n\
\ \"acc_norm_stderr\": 0.003090499801090434\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322663,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322663\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7870967741935484,\n\
\ \"acc_stderr\": 0.02328766512726855,\n \"acc_norm\": 0.7870967741935484,\n\
\ \"acc_norm_stderr\": 0.02328766512726855\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993466,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993466\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.0165136760311796,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.0165136760311796\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.024659685185967287,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.024659685185967287\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146292,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146292\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6352509179926561,\n\
\ \"mc1_stderr\": 0.016850961061720134,\n \"mc2\": 0.7835453305304184,\n\
\ \"mc2_stderr\": 0.01361341647369438\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479674\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.012679297549515427\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/YamshadowInex12_Experiment26T3q
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-46.019757.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-27-46.019757.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- '**/details_harness|winogrande|5_2024-04-09T10-27-46.019757.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T10-27-46.019757.parquet'
- config_name: results
data_files:
- split: 2024_04_09T10_27_46.019757
path:
- results_2024-04-09T10-27-46.019757.parquet
- split: latest
path:
- results_2024-04-09T10-27-46.019757.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/YamshadowInex12_Experiment26T3q
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/YamshadowInex12_Experiment26T3q](https://huggingface.co/MaziyarPanahi/YamshadowInex12_Experiment26T3q) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q",
"harness_winogrande_5",
split="train")
```
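If you only need the aggregated scores rather than the per-sample details, you can instead load the `results` configuration listed above; a minimal sketch (the `latest` split always points to the most recent run):
```python
from datasets import load_dataset

# Aggregated metrics for the most recent evaluation run of this model.
results = load_dataset(
    "open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q",
    "results",
    split="latest",
)
```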
## Latest results
These are the [latest results from run 2024-04-09T10:27:46.019757](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q/blob/main/results_2024-04-09T10-27-46.019757.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6510455186561941,
"acc_stderr": 0.032057149961613414,
"acc_norm": 0.6501342340668828,
"acc_norm_stderr": 0.03273135985814274,
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720134,
"mc2": 0.7835453305304184,
"mc2_stderr": 0.01361341647369438
},
"harness|arc:challenge|25": {
"acc": 0.7158703071672355,
"acc_stderr": 0.013179442447653886,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7175861382194781,
"acc_stderr": 0.004492535748097627,
"acc_norm": 0.8925512846046604,
"acc_norm_stderr": 0.003090499801090434
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322663,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322663
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993466,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993466
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.0165136760311796,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.0165136760311796
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.024659685185967287,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.024659685185967287
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146292,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146292
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6352509179926561,
"mc1_stderr": 0.016850961061720134,
"mc2": 0.7835453305304184,
"mc2_stderr": 0.01361341647369438
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479674
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515427
}
}
```
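The same figures can be read programmatically from the raw results file linked above. A minimal sketch (the exact layout of the JSON beyond the keys shown above is an assumption):
```python
import json

from huggingface_hub import hf_hub_download

# Fetch the raw results file referenced in the link above.
path = hf_hub_download(
    repo_id="open-llm-leaderboard/details_MaziyarPanahi__YamshadowInex12_Experiment26T3q",
    filename="results_2024-04-09T10-27-46.019757.json",
    repo_type="dataset",
)
with open(path) as f:
    data = json.load(f)

# Depending on the harness version, the per-task metrics sit either at the
# top level (as printed above) or under a "results" key, so handle both.
metrics = data.get("results", data)
print(metrics["all"])
```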
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_MaziyarPanahi__Experiment26Yamshadow_Ognoexperiment27Multi_verse_model | open-llm-leaderboard-old | "2024-04-09T10:31:17Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T10:30:56Z" | ---
pretty_name: Evaluation run of MaziyarPanahi/Experiment26Yamshadow_Ognoexperiment27Multi_verse_model
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Experiment26Yamshadow_Ognoexperiment27Multi_verse_model](https://huggingface.co/MaziyarPanahi/Experiment26Yamshadow_Ognoexperiment27Multi_verse_model)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yamshadow_Ognoexperiment27Multi_verse_model\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T10:28:30.332722](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yamshadow_Ognoexperiment27Multi_verse_model/blob/main/results_2024-04-09T10-28-30.332722.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6514120387541046,\n\
\ \"acc_stderr\": 0.03208111349221554,\n \"acc_norm\": 0.6505274120269638,\n\
\ \"acc_norm_stderr\": 0.03275583108220625,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7810356492566133,\n\
\ \"mc2_stderr\": 0.013678421564491373\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838793,\n\
\ \"acc_norm\": 0.7320819112627986,\n \"acc_norm_stderr\": 0.012942030195136438\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7172873929496116,\n\
\ \"acc_stderr\": 0.0044939755273867375,\n \"acc_norm\": 0.8919537940649273,\n\
\ \"acc_norm_stderr\": 0.0030980431017758447\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.04697085136647863,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.04697085136647863\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778398,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778398\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.039955240076816806,\n \"\
acc_norm\": 0.3973509933774834,\n \"acc_norm_stderr\": 0.039955240076816806\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42905027932960893,\n\
\ \"acc_stderr\": 0.01655328786311604,\n \"acc_norm\": 0.42905027932960893,\n\
\ \"acc_norm_stderr\": 0.01655328786311604\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.025646863097137897,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.025646863097137897\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7345679012345679,\n \"acc_stderr\": 0.024569223600460842,\n\
\ \"acc_norm\": 0.7345679012345679,\n \"acc_norm_stderr\": 0.024569223600460842\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4771838331160365,\n\
\ \"acc_stderr\": 0.012756933382823698,\n \"acc_norm\": 0.4771838331160365,\n\
\ \"acc_norm_stderr\": 0.012756933382823698\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6797385620915033,\n \"acc_stderr\": 0.018875682938069443,\n \
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.018875682938069443\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7810356492566133,\n\
\ \"mc2_stderr\": 0.013678421564491373\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8492501973164956,\n \"acc_stderr\": 0.010056094631479686\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6974981046247157,\n \
\ \"acc_stderr\": 0.012652544133186141\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Experiment26Yamshadow_Ognoexperiment27Multi_verse_model
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-30.332722.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-28-30.332722.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- '**/details_harness|winogrande|5_2024-04-09T10-28-30.332722.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T10-28-30.332722.parquet'
- config_name: results
data_files:
- split: 2024_04_09T10_28_30.332722
path:
- results_2024-04-09T10-28-30.332722.parquet
- split: latest
path:
- results_2024-04-09T10-28-30.332722.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Experiment26Yamshadow_Ognoexperiment27Multi_verse_model
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Experiment26Yamshadow_Ognoexperiment27Multi_verse_model](https://huggingface.co/MaziyarPanahi/Experiment26Yamshadow_Ognoexperiment27Multi_verse_model) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yamshadow_Ognoexperiment27Multi_verse_model",
"harness_winogrande_5",
	split="latest")
```
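Each per-task configuration declared in the YAML header above also exposes a `latest` split pointing at the most recent parquet files. As a minimal sketch (the config name below is just one of the configurations listed above; any of the others can be substituted):
```python
from datasets import load_dataset

# Load the most recent detailed predictions for a single MMLU subtask.
# Any config_name from the YAML header (e.g. "harness_truthfulqa_mc_0") works the same way.
details = load_dataset(
    "open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yamshadow_Ognoexperiment27Multi_verse_model",
    "harness_hendrycksTest_abstract_algebra_5",
    split="latest",
)
print(details)
```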
## Latest results
These are the [latest results from run 2024-04-09T10:28:30.332722](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yamshadow_Ognoexperiment27Multi_verse_model/blob/main/results_2024-04-09T10-28-30.332722.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6514120387541046,
"acc_stderr": 0.03208111349221554,
"acc_norm": 0.6505274120269638,
"acc_norm_stderr": 0.03275583108220625,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7810356492566133,
"mc2_stderr": 0.013678421564491373
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838793,
"acc_norm": 0.7320819112627986,
"acc_norm_stderr": 0.012942030195136438
},
"harness|hellaswag|10": {
"acc": 0.7172873929496116,
"acc_stderr": 0.0044939755273867375,
"acc_norm": 0.8919537940649273,
"acc_norm_stderr": 0.0030980431017758447
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04697085136647863,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04697085136647863
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778398,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778398
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.039955240076816806,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.039955240076816806
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42905027932960893,
"acc_stderr": 0.01655328786311604,
"acc_norm": 0.42905027932960893,
"acc_norm_stderr": 0.01655328786311604
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.025646863097137897,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.025646863097137897
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7345679012345679,
"acc_stderr": 0.024569223600460842,
"acc_norm": 0.7345679012345679,
"acc_norm_stderr": 0.024569223600460842
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4771838331160365,
"acc_stderr": 0.012756933382823698,
"acc_norm": 0.4771838331160365,
"acc_norm_stderr": 0.012756933382823698
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.018875682938069443,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.018875682938069443
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7810356492566133,
"mc2_stderr": 0.013678421564491373
},
"harness|winogrande|5": {
"acc": 0.8492501973164956,
"acc_stderr": 0.010056094631479686
},
"harness|gsm8k|5": {
"acc": 0.6974981046247157,
"acc_stderr": 0.012652544133186141
}
}
```
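The aggregated numbers shown above are also stored in the `results` configuration declared in the YAML header. A minimal sketch for reading them back, assuming the same `latest` split convention as the per-task configs:
```python
from datasets import load_dataset

# The "results" configuration stores one row of aggregated metrics per run.
results = load_dataset(
    "open-llm-leaderboard/details_MaziyarPanahi__Experiment26Yamshadow_Ognoexperiment27Multi_verse_model",
    "results",
    split="latest",
)
# Each row mirrors the JSON above (per-task accuracies plus the "all" aggregate).
print(results[0])
```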
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_MaziyarPanahi__YamshadowStrangemerges_32_Experiment24Ognoexperiment27 | open-llm-leaderboard-old | "2024-04-09T10:31:40Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T10:31:18Z" | ---
pretty_name: Evaluation run of MaziyarPanahi/YamshadowStrangemerges_32_Experiment24Ognoexperiment27
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/YamshadowStrangemerges_32_Experiment24Ognoexperiment27](https://huggingface.co/MaziyarPanahi/YamshadowStrangemerges_32_Experiment24Ognoexperiment27)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__YamshadowStrangemerges_32_Experiment24Ognoexperiment27\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T10:28:53.543551](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__YamshadowStrangemerges_32_Experiment24Ognoexperiment27/blob/main/results_2024-04-09T10-28-53.543551.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6508821789914124,\n\
\ \"acc_stderr\": 0.03207251204949206,\n \"acc_norm\": 0.650057066127438,\n\
\ \"acc_norm_stderr\": 0.03274572904790381,\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7813193022414375,\n\
\ \"mc2_stderr\": 0.013666530160211392\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7150170648464164,\n \"acc_stderr\": 0.013191348179838795,\n\
\ \"acc_norm\": 0.7337883959044369,\n \"acc_norm_stderr\": 0.012915774781523198\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7171878111929895,\n\
\ \"acc_stderr\": 0.004494454911844619,\n \"acc_norm\": 0.8916550487950607,\n\
\ \"acc_norm_stderr\": 0.003101803574556311\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080341,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080341\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.049888765156985884,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.049888765156985884\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.028133252578815632,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.028133252578815632\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8250319284802043,\n\
\ \"acc_stderr\": 0.013586619219903343,\n \"acc_norm\": 0.8250319284802043,\n\
\ \"acc_norm_stderr\": 0.013586619219903343\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7312138728323699,\n \"acc_stderr\": 0.023868003262500104,\n\
\ \"acc_norm\": 0.7312138728323699,\n \"acc_norm_stderr\": 0.023868003262500104\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.02982074719142248,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.02982074719142248\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015057,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015057\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578334,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578334\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7813193022414375,\n\
\ \"mc2_stderr\": 0.013666530160211392\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8484609313338595,\n \"acc_stderr\": 0.010077698907571776\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6952236542835482,\n \
\ \"acc_stderr\": 0.012679297549515425\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/YamshadowStrangemerges_32_Experiment24Ognoexperiment27
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-53.543551.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T10-28-53.543551.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- '**/details_harness|winogrande|5_2024-04-09T10-28-53.543551.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T10-28-53.543551.parquet'
- config_name: results
data_files:
- split: 2024_04_09T10_28_53.543551
path:
- results_2024-04-09T10-28-53.543551.parquet
- split: latest
path:
- results_2024-04-09T10-28-53.543551.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/YamshadowStrangemerges_32_Experiment24Ognoexperiment27
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/YamshadowStrangemerges_32_Experiment24Ognoexperiment27](https://huggingface.co/MaziyarPanahi/YamshadowStrangemerges_32_Experiment24Ognoexperiment27) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__YamshadowStrangemerges_32_Experiment24Ognoexperiment27",
"harness_winogrande_5",
split="train")
```
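The per-task configurations above all follow the same pattern; the aggregated metrics live in the `results` configuration defined in this card's front matter. A minimal sketch of loading it (the config name and the `latest` split come from the YAML above; nothing else is assumed):
```python
from datasets import load_dataset
# The "results" configuration holds the aggregated metrics for the run;
# the "latest" split always points to the most recent results file.
results = load_dataset(
    "open-llm-leaderboard/details_MaziyarPanahi__YamshadowStrangemerges_32_Experiment24Ognoexperiment27",
    "results",
    split="latest",
)
print(results[0])  # one row with the aggregated scores of the run
```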
## Latest results
These are the [latest results from run 2024-04-09T10:28:53.543551](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__YamshadowStrangemerges_32_Experiment24Ognoexperiment27/blob/main/results_2024-04-09T10-28-53.543551.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6508821789914124,
"acc_stderr": 0.03207251204949206,
"acc_norm": 0.650057066127438,
"acc_norm_stderr": 0.03274572904790381,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7813193022414375,
"mc2_stderr": 0.013666530160211392
},
"harness|arc:challenge|25": {
"acc": 0.7150170648464164,
"acc_stderr": 0.013191348179838795,
"acc_norm": 0.7337883959044369,
"acc_norm_stderr": 0.012915774781523198
},
"harness|hellaswag|10": {
"acc": 0.7171878111929895,
"acc_stderr": 0.004494454911844619,
"acc_norm": 0.8916550487950607,
"acc_norm_stderr": 0.003101803574556311
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080341,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080341
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.56,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.028133252578815632,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.028133252578815632
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8250319284802043,
"acc_stderr": 0.013586619219903343,
"acc_norm": 0.8250319284802043,
"acc_norm_stderr": 0.013586619219903343
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7312138728323699,
"acc_stderr": 0.023868003262500104,
"acc_norm": 0.7312138728323699,
"acc_norm_stderr": 0.023868003262500104
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.02982074719142248,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.02982074719142248
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015057,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015057
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578334,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578334
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7813193022414375,
"mc2_stderr": 0.013666530160211392
},
"harness|winogrande|5": {
"acc": 0.8484609313338595,
"acc_stderr": 0.010077698907571776
},
"harness|gsm8k|5": {
"acc": 0.6952236542835482,
"acc_stderr": 0.012679297549515425
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
CyberHarem/fujimiya_shihoko_otonarinotenshisamaniitsunomanikadameningennisareteitaken | CyberHarem | "2024-04-09T10:49:17Z" | 0 | 0 | [
"task_categories:text-to-image",
"license:mit",
"size_categories:n<1K",
"library:datasets",
"library:mlcroissant",
"region:us",
"art",
"not-for-all-audiences"
] | [
"text-to-image"
] | "2024-04-09T10:44:50Z" | ---
license: mit
task_categories:
- text-to-image
tags:
- art
- not-for-all-audiences
size_categories:
- n<1K
---
# Dataset of Fujimiya Shihoko/藤宮志保子 (Otonari no Tenshi-sama ni Itsunomanika Dame Ningen ni Sareteita Ken)
This is the dataset of Fujimiya Shihoko/藤宮志保子 (Otonari no Tenshi-sama ni Itsunomanika Dame Ningen ni Sareteita Ken), containing 62 images and their tags.
The core tags of this character are `long_hair, brown_hair, black_hair, brown_eyes, blunt_bangs`, which are pruned in this dataset.
Images are crawled from many sites (e.g. danbooru, pixiv, zerochan ...); the auto-crawling system is powered by the [DeepGHS Team](https://github.com/deepghs) ([huggingface organization](https://huggingface.co/deepghs)).
## List of Packages
| Name | Images | Size | Download | Type | Description |
|:-----------------|---------:|:----------|:----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------|:---------------------------------------------------------------------|
| raw | 62 | 37.07 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimiya_shihoko_otonarinotenshisamaniitsunomanikadameningennisareteitaken/resolve/main/dataset-raw.zip) | Waifuc-Raw | Raw data with meta information (min edge aligned to 1400 if larger). |
| 1200 | 62 | 37.05 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimiya_shihoko_otonarinotenshisamaniitsunomanikadameningennisareteitaken/resolve/main/dataset-1200.zip) | IMG+TXT | dataset with the shorter side not exceeding 1200 pixels. |
| stage3-p480-1200 | 92 | 50.04 MiB | [Download](https://huggingface.co/datasets/CyberHarem/fujimiya_shihoko_otonarinotenshisamaniitsunomanikadameningennisareteitaken/resolve/main/dataset-stage3-p480-1200.zip) | IMG+TXT | 3-stage cropped dataset with the area not less than 480x480 pixels. |
### Load Raw Dataset with Waifuc
We provide the raw dataset (including tagged images) for [waifuc](https://deepghs.github.io/waifuc/main/tutorials/installation/index.html) loading. If you need this, just run the following code:
```python
import os
import zipfile
from huggingface_hub import hf_hub_download
from waifuc.source import LocalSource
# download raw archive file
zip_file = hf_hub_download(
repo_id='CyberHarem/fujimiya_shihoko_otonarinotenshisamaniitsunomanikadameningennisareteitaken',
repo_type='dataset',
filename='dataset-raw.zip',
)
# extract files to your directory
dataset_dir = 'dataset_dir'
os.makedirs(dataset_dir, exist_ok=True)
with zipfile.ZipFile(zip_file, 'r') as zf:
zf.extractall(dataset_dir)
# load the dataset with waifuc
source = LocalSource(dataset_dir)
for item in source:
print(item.image, item.meta['filename'], item.meta['tags'])
```
## List of Clusters
List of tag clustering results; some outfits may be mined here.
### Raw Text Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | Tags |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:-------------------------------------------------------------------------------------------------------------------------------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | 1girl, closed_mouth, smile, portrait, black_eyes, solo, turtleneck, voice_actor, looking_at_viewer, necklace, purple_jacket, sweater |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | 1girl, portrait, solo, open_mouth, profile, blush, from_side, indoors, :d, blurry_background, close-up |
| 2 | 15 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | 1girl, solo, smile, upper_body, white_shirt, closed_mouth, brown_jacket, indoors |
### Table Version
| # | Samples | Img-1 | Img-2 | Img-3 | Img-4 | Img-5 | 1girl | closed_mouth | smile | portrait | black_eyes | solo | turtleneck | voice_actor | looking_at_viewer | necklace | purple_jacket | sweater | open_mouth | profile | blush | from_side | indoors | :d | blurry_background | close-up | upper_body | white_shirt | brown_jacket |
|----:|----------:|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------------------------------|:--------|:---------------|:--------|:-----------|:-------------|:-------|:-------------|:--------------|:--------------------|:-----------|:----------------|:----------|:-------------|:----------|:--------|:------------|:----------|:-----|:--------------------|:-----------|:-------------|:--------------|:---------------|
| 0 | 18 | ![](samples/0/clu0-sample0.png) | ![](samples/0/clu0-sample1.png) | ![](samples/0/clu0-sample2.png) | ![](samples/0/clu0-sample3.png) | ![](samples/0/clu0-sample4.png) | X | X | X | X | X | X | X | X | X | X | X | X | | | | | | | | | | | |
| 1 | 8 | ![](samples/1/clu1-sample0.png) | ![](samples/1/clu1-sample1.png) | ![](samples/1/clu1-sample2.png) | ![](samples/1/clu1-sample3.png) | ![](samples/1/clu1-sample4.png) | X | | | X | | X | | | | | | | X | X | X | X | X | X | X | X | | | |
| 2 | 15 | ![](samples/2/clu2-sample0.png) | ![](samples/2/clu2-sample1.png) | ![](samples/2/clu2-sample2.png) | ![](samples/2/clu2-sample3.png) | ![](samples/2/clu2-sample4.png) | X | X | X | | | X | | | | | | | | | | | X | | | | X | X | X |
|
HiTZ/AbstRCT-ES | HiTZ | "2024-04-09T11:03:46Z" | 0 | 0 | [
"language:es",
"license:cc-by-nc-sa-4.0",
"arxiv:2301.10527",
"region:us"
] | null | "2024-04-09T10:49:18Z" | ---
license: cc-by-nc-sa-4.0
language:
- es
pretty_name: AbstRCT-ES
---
---
dataset_info:
- config_name: es
data_files:
- split: neoplasm_train
path: es/neoplasm_train-*
- split: neoplasm_dev
path: es/neoplasm_dev-*
- split: neoplasm_test
path: es/neoplasm_test-*
- split: glaucoma_test
path: es/glaucoma_test-*
- split: mixed_test
path: es/mixed_test-*
license: apache-2.0
task_categories:
- token-classification
language:
- es
tags:
- biology
- medical
pretty_name: AbstRCT-ES
---
<p align="center">
<br>
<img src="http://www.ixa.eus/sites/default/files/anitdote.png" style="width: 30%;">
<h2 align="center">AbstRCT-ES</h2>
<br>
We translate the [AbstRCT English Argument Mining Dataset](https://gitlab.com/tomaye/abstrct) to generate a parallel Spanish version
using DeepL; labels are projected using [Easy Label Projection](https://github.com/ikergarcia1996/Easy-Label-Projection) and manually corrected.
- 📖 Paper: [Crosslingual Argument Mining in the Medical Domain](https://arxiv.org/abs/2301.10527)
- 🌐 Project Website: [https://univ-cotedazur.eu/antidote](https://univ-cotedazur.eu/antidote)
- Code: [https://github.com/ragerri/abstrct-projections/tree/final](https://github.com/ragerri/abstrct-projections/tree/final)
- Funding: CHIST-ERA XAI 2019 call. Antidote (PCI2020-120717-2) funded by MCIN/AEI /10.13039/501100011033 and by European Union NextGenerationEU/PRTR
## Labels
```python
{
"O": 0,
"B-Claim": 1,
"I-Claim": 2,
"B-Premise": 3,
"I-Premise": 4,
}
```
A `claim` is a concluding statement made by the author about the outcome of the study. In the medical domain it may be an assertion of a diagnosis or a treatment.
A `premise` corresponds to an observation or measurement in the study (ground truth), which supports or attacks another argument component, usually a claim.
It is important that they are observed facts and are therefore credible without further evidence.
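As a quick usage sketch, the splits named in the front matter above can be loaded directly with `datasets`. The card does not document the column names, so this example only loads a split and inverts the label mapping; everything beyond the repository id, config name, split names, and label ids is an assumption:
```python
from datasets import load_dataset

# BIO label scheme over Claim / Premise spans, as listed above
LABEL2ID = {"O": 0, "B-Claim": 1, "I-Claim": 2, "B-Premise": 3, "I-Premise": 4}
ID2LABEL = {v: k for k, v in LABEL2ID.items()}

# Config "es" and split names (neoplasm_train, neoplasm_dev, neoplasm_test,
# glaucoma_test, mixed_test) come from the dataset configuration above.
data = load_dataset("HiTZ/AbstRCT-ES", "es", split="neoplasm_train")
print(data)      # column names and number of abstracts
print(data[0])   # first annotated example
```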
## Citation
````bibtex
@misc{yeginbergen2024crosslingual,
title={Cross-lingual Argument Mining in the Medical Domain},
author={Anar Yeginbergen and Rodrigo Agerri},
year={2024},
eprint={2301.10527},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```` |
Ethical-Lens/Tox1K | Ethical-Lens | "2024-04-09T10:55:35Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T10:53:16Z" | ---
license: apache-2.0
---
|
OpenDevin/SWE-bench-devin-passed | OpenDevin | "2024-04-09T12:34:48Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T10:53:32Z" | ---
license: mit
dataset_info:
features:
- name: repo
dtype: string
- name: instance_id
dtype: string
- name: base_commit
dtype: string
- name: patch
dtype: string
- name: test_patch
dtype: string
- name: problem_statement
dtype: string
- name: hints_text
dtype: string
- name: created_at
dtype: string
- name: version
dtype: string
- name: FAIL_TO_PASS
dtype: string
- name: PASS_TO_PASS
dtype: string
- name: environment_setup_commit
dtype: string
splits:
- name: test
num_bytes: 1442151.0265911072
num_examples: 79
download_size: 299539
dataset_size: 1442151.0265911072
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
open-llm-leaderboard-old/details_lemon-mint__gemma-7b-openhermes-v0.80 | open-llm-leaderboard-old | "2024-04-09T11:28:19Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T11:09:47Z" | ---
pretty_name: Evaluation run of lemon-mint/gemma-7b-openhermes-v0.80
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [lemon-mint/gemma-7b-openhermes-v0.80](https://huggingface.co/lemon-mint/gemma-7b-openhermes-v0.80)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_lemon-mint__gemma-7b-openhermes-v0.80\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T11:26:08.031314](https://huggingface.co/datasets/open-llm-leaderboard/details_lemon-mint__gemma-7b-openhermes-v0.80/blob/main/results_2024-04-09T11-26-08.031314.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5607727208540026,\n\
\ \"acc_stderr\": 0.03374009542716251,\n \"acc_norm\": 0.5646252525030766,\n\
\ \"acc_norm_stderr\": 0.034417724800141,\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4706193972805725,\n\
\ \"mc2_stderr\": 0.015617837881275841\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5179180887372014,\n \"acc_stderr\": 0.014602005585490976,\n\
\ \"acc_norm\": 0.5435153583617748,\n \"acc_norm_stderr\": 0.014555949760496442\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5553674566819359,\n\
\ \"acc_stderr\": 0.004959094146471529,\n \"acc_norm\": 0.7369049990041824,\n\
\ \"acc_norm_stderr\": 0.004394136724173006\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562427,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562427\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5849056603773585,\n \"acc_stderr\": 0.03032594578928611,\n\
\ \"acc_norm\": 0.5849056603773585,\n \"acc_norm_stderr\": 0.03032594578928611\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.39,\n\
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.045766654032077615,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.045766654032077615\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4553191489361702,\n \"acc_stderr\": 0.032555253593403555,\n\
\ \"acc_norm\": 0.4553191489361702,\n \"acc_norm_stderr\": 0.032555253593403555\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3684210526315789,\n\
\ \"acc_stderr\": 0.04537815354939392,\n \"acc_norm\": 0.3684210526315789,\n\
\ \"acc_norm_stderr\": 0.04537815354939392\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4021164021164021,\n \"acc_stderr\": 0.025253032554997692,\n \"\
acc_norm\": 0.4021164021164021,\n \"acc_norm_stderr\": 0.025253032554997692\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6741935483870968,\n\
\ \"acc_stderr\": 0.0266620105785671,\n \"acc_norm\": 0.6741935483870968,\n\
\ \"acc_norm_stderr\": 0.0266620105785671\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713547,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713547\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8186528497409327,\n \"acc_stderr\": 0.027807032360686088,\n\
\ \"acc_norm\": 0.8186528497409327,\n \"acc_norm_stderr\": 0.027807032360686088\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5307692307692308,\n \"acc_stderr\": 0.025302958890850154,\n\
\ \"acc_norm\": 0.5307692307692308,\n \"acc_norm_stderr\": 0.025302958890850154\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683515,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683515\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.7669724770642202,\n\
\ \"acc_stderr\": 0.018125669180861507,\n \"acc_norm\": 0.7669724770642202,\n\
\ \"acc_norm_stderr\": 0.018125669180861507\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.3888888888888889,\n \"acc_stderr\": 0.033247089118091176,\n\
\ \"acc_norm\": 0.3888888888888889,\n \"acc_norm_stderr\": 0.033247089118091176\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7009803921568627,\n \"acc_stderr\": 0.032133257173736156,\n \"\
acc_norm\": 0.7009803921568627,\n \"acc_norm_stderr\": 0.032133257173736156\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6591928251121076,\n\
\ \"acc_stderr\": 0.03181149747055359,\n \"acc_norm\": 0.6591928251121076,\n\
\ \"acc_norm_stderr\": 0.03181149747055359\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.648854961832061,\n \"acc_stderr\": 0.04186445163013751,\n\
\ \"acc_norm\": 0.648854961832061,\n \"acc_norm_stderr\": 0.04186445163013751\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04643454608906275,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04643454608906275\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.656441717791411,\n \"acc_stderr\": 0.037311335196738925,\n\
\ \"acc_norm\": 0.656441717791411,\n \"acc_norm_stderr\": 0.037311335196738925\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8376068376068376,\n\
\ \"acc_stderr\": 0.02416161812798774,\n \"acc_norm\": 0.8376068376068376,\n\
\ \"acc_norm_stderr\": 0.02416161812798774\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7650063856960408,\n\
\ \"acc_stderr\": 0.015162024152278441,\n \"acc_norm\": 0.7650063856960408,\n\
\ \"acc_norm_stderr\": 0.015162024152278441\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5982658959537572,\n \"acc_stderr\": 0.026394104177643634,\n\
\ \"acc_norm\": 0.5982658959537572,\n \"acc_norm_stderr\": 0.026394104177643634\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2335195530726257,\n\
\ \"acc_stderr\": 0.014149575348976273,\n \"acc_norm\": 0.2335195530726257,\n\
\ \"acc_norm_stderr\": 0.014149575348976273\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5980392156862745,\n \"acc_stderr\": 0.028074158947600656,\n\
\ \"acc_norm\": 0.5980392156862745,\n \"acc_norm_stderr\": 0.028074158947600656\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.594855305466238,\n\
\ \"acc_stderr\": 0.02788238379132596,\n \"acc_norm\": 0.594855305466238,\n\
\ \"acc_norm_stderr\": 0.02788238379132596\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6296296296296297,\n \"acc_stderr\": 0.02686949074481525,\n\
\ \"acc_norm\": 0.6296296296296297,\n \"acc_norm_stderr\": 0.02686949074481525\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.02927553215970472,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.02927553215970472\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4282920469361147,\n\
\ \"acc_stderr\": 0.01263822388031316,\n \"acc_norm\": 0.4282920469361147,\n\
\ \"acc_norm_stderr\": 0.01263822388031316\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.4522058823529412,\n \"acc_stderr\": 0.030233758551596445,\n\
\ \"acc_norm\": 0.4522058823529412,\n \"acc_norm_stderr\": 0.030233758551596445\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5751633986928104,\n \"acc_stderr\": 0.019997973035458333,\n \
\ \"acc_norm\": 0.5751633986928104,\n \"acc_norm_stderr\": 0.019997973035458333\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.029923100563683906,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.029923100563683906\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7810945273631841,\n\
\ \"acc_stderr\": 0.029239174636647,\n \"acc_norm\": 0.7810945273631841,\n\
\ \"acc_norm_stderr\": 0.029239174636647\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7660818713450293,\n \"acc_stderr\": 0.03246721765117826,\n\
\ \"acc_norm\": 0.7660818713450293,\n \"acc_norm_stderr\": 0.03246721765117826\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2937576499388005,\n\
\ \"mc1_stderr\": 0.015945068581236614,\n \"mc2\": 0.4706193972805725,\n\
\ \"mc2_stderr\": 0.015617837881275841\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6677190213101816,\n \"acc_stderr\": 0.013238316554236526\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4336618650492798,\n \
\ \"acc_stderr\": 0.013650728047064686\n }\n}\n```"
repo_url: https://huggingface.co/lemon-mint/gemma-7b-openhermes-v0.80
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-07-43.510982.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-26-08.031314.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-26-08.031314.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- '**/details_harness|winogrande|5_2024-04-09T11-07-43.510982.parquet'
- split: 2024_04_09T11_26_08.031314
path:
- '**/details_harness|winogrande|5_2024-04-09T11-26-08.031314.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T11-26-08.031314.parquet'
- config_name: results
data_files:
- split: 2024_04_09T11_07_43.510982
path:
- results_2024-04-09T11-07-43.510982.parquet
- split: 2024_04_09T11_26_08.031314
path:
- results_2024-04-09T11-26-08.031314.parquet
- split: latest
path:
- results_2024-04-09T11-26-08.031314.parquet
---
# Dataset Card for Evaluation run of lemon-mint/gemma-7b-openhermes-v0.80
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [lemon-mint/gemma-7b-openhermes-v0.80](https://huggingface.co/lemon-mint/gemma-7b-openhermes-v0.80) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_lemon-mint__gemma-7b-openhermes-v0.80",
"harness_winogrande_5",
split="train")
```
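To work with the aggregated metrics instead of the per-sample details, you can load the "results" configuration. The snippet below is a minimal sketch that assumes the `results` configuration and `latest` split declared in the YAML header above:
```python
from datasets import load_dataset

# Load the aggregated results (one row per evaluation run) and keep only the
# latest run. "results" and "latest" are the config/split names declared in
# the YAML header above; adjust them if the repository layout changes.
results = load_dataset(
    "open-llm-leaderboard/details_lemon-mint__gemma-7b-openhermes-v0.80",
    "results",
    split="latest",
)
print(results[0])
```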
## Latest results
These are the [latest results from run 2024-04-09T11:26:08.031314](https://huggingface.co/datasets/open-llm-leaderboard/details_lemon-mint__gemma-7b-openhermes-v0.80/blob/main/results_2024-04-09T11-26-08.031314.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5607727208540026,
"acc_stderr": 0.03374009542716251,
"acc_norm": 0.5646252525030766,
"acc_norm_stderr": 0.034417724800141,
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.4706193972805725,
"mc2_stderr": 0.015617837881275841
},
"harness|arc:challenge|25": {
"acc": 0.5179180887372014,
"acc_stderr": 0.014602005585490976,
"acc_norm": 0.5435153583617748,
"acc_norm_stderr": 0.014555949760496442
},
"harness|hellaswag|10": {
"acc": 0.5553674566819359,
"acc_stderr": 0.004959094146471529,
"acc_norm": 0.7369049990041824,
"acc_norm_stderr": 0.004394136724173006
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562427,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562427
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5849056603773585,
"acc_stderr": 0.03032594578928611,
"acc_norm": 0.5849056603773585,
"acc_norm_stderr": 0.03032594578928611
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.045766654032077615,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.045766654032077615
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4553191489361702,
"acc_stderr": 0.032555253593403555,
"acc_norm": 0.4553191489361702,
"acc_norm_stderr": 0.032555253593403555
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3684210526315789,
"acc_stderr": 0.04537815354939392,
"acc_norm": 0.3684210526315789,
"acc_norm_stderr": 0.04537815354939392
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4021164021164021,
"acc_stderr": 0.025253032554997692,
"acc_norm": 0.4021164021164021,
"acc_norm_stderr": 0.025253032554997692
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6741935483870968,
"acc_stderr": 0.0266620105785671,
"acc_norm": 0.6741935483870968,
"acc_norm_stderr": 0.0266620105785671
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713547,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713547
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8186528497409327,
"acc_stderr": 0.027807032360686088,
"acc_norm": 0.8186528497409327,
"acc_norm_stderr": 0.027807032360686088
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5307692307692308,
"acc_stderr": 0.025302958890850154,
"acc_norm": 0.5307692307692308,
"acc_norm_stderr": 0.025302958890850154
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683515,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683515
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7669724770642202,
"acc_stderr": 0.018125669180861507,
"acc_norm": 0.7669724770642202,
"acc_norm_stderr": 0.018125669180861507
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.033247089118091176,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.033247089118091176
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7009803921568627,
"acc_stderr": 0.032133257173736156,
"acc_norm": 0.7009803921568627,
"acc_norm_stderr": 0.032133257173736156
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6591928251121076,
"acc_stderr": 0.03181149747055359,
"acc_norm": 0.6591928251121076,
"acc_norm_stderr": 0.03181149747055359
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.648854961832061,
"acc_stderr": 0.04186445163013751,
"acc_norm": 0.648854961832061,
"acc_norm_stderr": 0.04186445163013751
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04643454608906275,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04643454608906275
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.656441717791411,
"acc_stderr": 0.037311335196738925,
"acc_norm": 0.656441717791411,
"acc_norm_stderr": 0.037311335196738925
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8376068376068376,
"acc_stderr": 0.02416161812798774,
"acc_norm": 0.8376068376068376,
"acc_norm_stderr": 0.02416161812798774
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7650063856960408,
"acc_stderr": 0.015162024152278441,
"acc_norm": 0.7650063856960408,
"acc_norm_stderr": 0.015162024152278441
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5982658959537572,
"acc_stderr": 0.026394104177643634,
"acc_norm": 0.5982658959537572,
"acc_norm_stderr": 0.026394104177643634
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2335195530726257,
"acc_stderr": 0.014149575348976273,
"acc_norm": 0.2335195530726257,
"acc_norm_stderr": 0.014149575348976273
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5980392156862745,
"acc_stderr": 0.028074158947600656,
"acc_norm": 0.5980392156862745,
"acc_norm_stderr": 0.028074158947600656
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.594855305466238,
"acc_stderr": 0.02788238379132596,
"acc_norm": 0.594855305466238,
"acc_norm_stderr": 0.02788238379132596
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.02686949074481525,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.02686949074481525
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.02927553215970472,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.02927553215970472
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4282920469361147,
"acc_stderr": 0.01263822388031316,
"acc_norm": 0.4282920469361147,
"acc_norm_stderr": 0.01263822388031316
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.4522058823529412,
"acc_stderr": 0.030233758551596445,
"acc_norm": 0.4522058823529412,
"acc_norm_stderr": 0.030233758551596445
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5751633986928104,
"acc_stderr": 0.019997973035458333,
"acc_norm": 0.5751633986928104,
"acc_norm_stderr": 0.019997973035458333
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.029923100563683906,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.029923100563683906
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7810945273631841,
"acc_stderr": 0.029239174636647,
"acc_norm": 0.7810945273631841,
"acc_norm_stderr": 0.029239174636647
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7660818713450293,
"acc_stderr": 0.03246721765117826,
"acc_norm": 0.7660818713450293,
"acc_norm_stderr": 0.03246721765117826
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2937576499388005,
"mc1_stderr": 0.015945068581236614,
"mc2": 0.4706193972805725,
"mc2_stderr": 0.015617837881275841
},
"harness|winogrande|5": {
"acc": 0.6677190213101816,
"acc_stderr": 0.013238316554236526
},
"harness|gsm8k|5": {
"acc": 0.4336618650492798,
"acc_stderr": 0.013650728047064686
}
}
```
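The block above is plain JSON, so it can be inspected with the standard library. A minimal, self-contained sketch (using two of the entries above as a sample; the variable names are illustrative only) could flatten the nested per-task metrics into rows:
```python
import json

# Hypothetical sample: two entries copied from the results block above.
sample = """
{
  "harness|winogrande|5": {"acc": 0.6677190213101816, "acc_stderr": 0.013238316554236526},
  "harness|gsm8k|5": {"acc": 0.4336618650492798, "acc_stderr": 0.013650728047064686}
}
"""

results = json.loads(sample)

# Flatten the task -> metric -> value nesting into (task, metric, value) rows.
for task, metrics in results.items():
    for metric, value in metrics.items():
        print(f"{task:25s} {metric:12s} {value:.4f}")
```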
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AnonymousPaperSubmissions/RoBERTa_eval_data | AnonymousPaperSubmissions | "2024-04-09T11:16:20Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T11:12:53Z" | ---
license: mit
---
|
AnonymousPaperSubmissions/Testing_raw_data | AnonymousPaperSubmissions | "2024-04-09T11:17:30Z" | 0 | 0 | [
"license:mit",
"size_categories:1K<n<10K",
"format:csv",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T11:17:15Z" | ---
license: mit
---
|
AnonymousPaperSubmissions/Training_CC | AnonymousPaperSubmissions | "2024-04-09T11:18:49Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T11:18:38Z" | ---
license: mit
---
|
holmes26/fluent_noisy | holmes26 | "2024-04-10T10:03:58Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:audio",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T11:19:22Z" | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
- name: noisy_speech
sequence: float64
- name: noise
dtype: string
splits:
- name: train
num_bytes: 1132681520.176
num_examples: 3168
- name: test
num_bytes: 176892283.0
num_examples: 459
download_size: 1042211351
dataset_size: 1309573803.176
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
ML4CO/TSPLIBOriDataset | ML4CO | "2024-05-17T08:45:00Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-09T11:20:33Z" | ---
license: apache-2.0
---
|
u-10bei/merge_aozora_jalaw_jawiki | u-10bei | "2024-04-09T11:22:29Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T11:21:14Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1633860729
num_examples: 1664365
download_size: 741411098
dataset_size: 1633860729
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_ABX-AI__Silver-Sun-11B | open-llm-leaderboard-old | "2024-04-09T11:26:25Z" | 0 | 0 | [
"region:us"
] | null | "2024-04-09T11:26:06Z" | ---
pretty_name: Evaluation run of ABX-AI/Silver-Sun-11B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ABX-AI/Silver-Sun-11B](https://huggingface.co/ABX-AI/Silver-Sun-11B) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-04-09T11:23:48.663620](https://huggingface.co/datasets/open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B/blob/main/results_2024-04-09T11-23-48.663620.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6613299971122604,\n\
\ \"acc_stderr\": 0.03117186211934933,\n \"acc_norm\": 0.6730584240663938,\n\
\ \"acc_norm_stderr\": 0.03199188346673098,\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.618855922705881,\n\
\ \"mc2_stderr\": 0.015586954390037554\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6680887372013652,\n \"acc_stderr\": 0.013760988200880533,\n\
\ \"acc_norm\": 0.6979522184300341,\n \"acc_norm_stderr\": 0.013417519144716413\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.692989444333798,\n\
\ \"acc_stderr\": 0.004603111343213067,\n \"acc_norm\": 0.8791077474606652,\n\
\ \"acc_norm_stderr\": 0.0032533576201717973\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7697368421052632,\n \"acc_stderr\": 0.03426059424403165,\n\
\ \"acc_norm\": 0.7697368421052632,\n \"acc_norm_stderr\": 0.03426059424403165\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249386,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249386\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.034765901043041336,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.034765901043041336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.51,\n\
\ \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.51,\n \
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.04043461861916747,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.04043461861916747\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.48148148148148145,\n \"acc_stderr\": 0.025733641991838987,\n \"\
acc_norm\": 0.48148148148148145,\n \"acc_norm_stderr\": 0.025733641991838987\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8451612903225807,\n \"acc_stderr\": 0.020579287326583227,\n \"\
acc_norm\": 0.8451612903225807,\n \"acc_norm_stderr\": 0.020579287326583227\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n \"\
acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.806060606060606,\n \"acc_stderr\": 0.03087414513656208,\n\
\ \"acc_norm\": 0.806060606060606,\n \"acc_norm_stderr\": 0.03087414513656208\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8838383838383839,\n \"acc_stderr\": 0.02282888177524938,\n \"\
acc_norm\": 0.8838383838383839,\n \"acc_norm_stderr\": 0.02282888177524938\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5833333333333334,\n \"acc_stderr\": 0.033622774366080424,\n \"\
acc_norm\": 0.5833333333333334,\n \"acc_norm_stderr\": 0.033622774366080424\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8676470588235294,\n \"acc_stderr\": 0.02378429752091886,\n \"\
acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.02378429752091886\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8818565400843882,\n \"acc_stderr\": 0.021011052659878463,\n \
\ \"acc_norm\": 0.8818565400843882,\n \"acc_norm_stderr\": 0.021011052659878463\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573973,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573973\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165616,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165616\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069363,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069363\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.49385474860335193,\n\
\ \"acc_stderr\": 0.016721238483631412,\n \"acc_norm\": 0.49385474860335193,\n\
\ \"acc_norm_stderr\": 0.016721238483631412\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817962,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817962\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7808641975308642,\n \"acc_stderr\": 0.023016705640262196,\n\
\ \"acc_norm\": 0.7808641975308642,\n \"acc_norm_stderr\": 0.023016705640262196\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.500651890482399,\n\
\ \"acc_stderr\": 0.01277022525225556,\n \"acc_norm\": 0.500651890482399,\n\
\ \"acc_norm_stderr\": 0.01277022525225556\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7610294117647058,\n \"acc_stderr\": 0.025905280644893006,\n\
\ \"acc_norm\": 0.7610294117647058,\n \"acc_norm_stderr\": 0.025905280644893006\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.696078431372549,\n \"acc_stderr\": 0.018607552131279827,\n \
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.018607552131279827\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142777,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142777\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.93,\n \"acc_stderr\": 0.025643239997624294,\n \
\ \"acc_norm\": 0.93,\n \"acc_norm_stderr\": 0.025643239997624294\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.03851597683718533,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.03851597683718533\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47613219094247244,\n\
\ \"mc1_stderr\": 0.017483547156961578,\n \"mc2\": 0.618855922705881,\n\
\ \"mc2_stderr\": 0.015586954390037554\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8413575374901342,\n \"acc_stderr\": 0.010267936243028214\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.00530705079605762,\n \
\ \"acc_stderr\": 0.0020013057209480405\n }\n}\n```"
repo_url: https://huggingface.co/ABX-AI/Silver-Sun-11B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-23-48.663620.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-04-09T11-23-48.663620.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- '**/details_harness|winogrande|5_2024-04-09T11-23-48.663620.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-04-09T11-23-48.663620.parquet'
- config_name: results
data_files:
- split: 2024_04_09T11_23_48.663620
path:
- results_2024-04-09T11-23-48.663620.parquet
- split: latest
path:
- results_2024-04-09T11-23-48.663620.parquet
---
# Dataset Card for Evaluation run of ABX-AI/Silver-Sun-11B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ABX-AI/Silver-Sun-11B](https://huggingface.co/ABX-AI/Silver-Sun-11B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-04-09T11:23:48.663620](https://huggingface.co/datasets/open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B/blob/main/results_2024-04-09T11-23-48.663620.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each of them in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6613299971122604,
"acc_stderr": 0.03117186211934933,
"acc_norm": 0.6730584240663938,
"acc_norm_stderr": 0.03199188346673098,
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.618855922705881,
"mc2_stderr": 0.015586954390037554
},
"harness|arc:challenge|25": {
"acc": 0.6680887372013652,
"acc_stderr": 0.013760988200880533,
"acc_norm": 0.6979522184300341,
"acc_norm_stderr": 0.013417519144716413
},
"harness|hellaswag|10": {
"acc": 0.692989444333798,
"acc_stderr": 0.004603111343213067,
"acc_norm": 0.8791077474606652,
"acc_norm_stderr": 0.0032533576201717973
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7697368421052632,
"acc_stderr": 0.03426059424403165,
"acc_norm": 0.7697368421052632,
"acc_norm_stderr": 0.03426059424403165
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249386,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249386
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.034765901043041336,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.034765901043041336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.04043461861916747,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.04043461861916747
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.025733641991838987,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.025733641991838987
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8451612903225807,
"acc_stderr": 0.020579287326583227,
"acc_norm": 0.8451612903225807,
"acc_norm_stderr": 0.020579287326583227
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.806060606060606,
"acc_stderr": 0.03087414513656208,
"acc_norm": 0.806060606060606,
"acc_norm_stderr": 0.03087414513656208
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8838383838383839,
"acc_stderr": 0.02282888177524938,
"acc_norm": 0.8838383838383839,
"acc_norm_stderr": 0.02282888177524938
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.033622774366080424,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.033622774366080424
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.02378429752091886,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.02378429752091886
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8818565400843882,
"acc_stderr": 0.021011052659878463,
"acc_norm": 0.8818565400843882,
"acc_norm_stderr": 0.021011052659878463
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573973,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573973
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165616,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165616
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069363,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069363
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.49385474860335193,
"acc_stderr": 0.016721238483631412,
"acc_norm": 0.49385474860335193,
"acc_norm_stderr": 0.016721238483631412
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817962,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817962
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7808641975308642,
"acc_stderr": 0.023016705640262196,
"acc_norm": 0.7808641975308642,
"acc_norm_stderr": 0.023016705640262196
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.500651890482399,
"acc_stderr": 0.01277022525225556,
"acc_norm": 0.500651890482399,
"acc_norm_stderr": 0.01277022525225556
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7610294117647058,
"acc_stderr": 0.025905280644893006,
"acc_norm": 0.7610294117647058,
"acc_norm_stderr": 0.025905280644893006
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.018607552131279827,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.018607552131279827
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142777,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142777
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900798,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.93,
"acc_stderr": 0.025643239997624294,
"acc_norm": 0.93,
"acc_norm_stderr": 0.025643239997624294
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.03851597683718533,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.03851597683718533
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47613219094247244,
"mc1_stderr": 0.017483547156961578,
"mc2": 0.618855922705881,
"mc2_stderr": 0.015586954390037554
},
"harness|winogrande|5": {
"acc": 0.8413575374901342,
"acc_stderr": 0.010267936243028214
},
"harness|gsm8k|5": {
"acc": 0.00530705079605762,
"acc_stderr": 0.0020013057209480405
}
}
```
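To reproduce the aggregated numbers above, a minimal sketch (relying only on the `results` configuration and its `latest` split declared in the YAML header of this card) could look like this:
```python
from datasets import load_dataset

# Minimal sketch: the "results" configuration and its "latest" split are declared
# in the YAML header above; each row holds the aggregated metrics of one run.
results = load_dataset(
    "open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B",
    "results",
    split="latest",
)
print(results[0])
```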
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
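Until a full description is added here, the 63 evaluation configurations declared in this card's YAML header can be inspected programmatically; a minimal sketch (assuming only the public `datasets` API) could be:
```python
from datasets import get_dataset_config_names

# Minimal sketch: list the configurations declared in the YAML header of this card
# (one per evaluated task, plus the aggregated "results" configuration).
configs = get_dataset_config_names("open-llm-leaderboard/details_ABX-AI__Silver-Sun-11B")
print(len(configs), configs[:5])
```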
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ML4CO/SATLIBOriDataset | ML4CO | "2024-04-09T12:26:17Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T11:38:09Z" | ---
license: apache-2.0
---
|
niajmahmud/Multimodal | niajmahmud | "2024-05-31T08:50:30Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"modality:timeseries",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T11:50:30Z" | ---
dataset_info:
features:
- name: combined_features
sequence: float32
- name: emotion
dtype: string
splits:
- name: multimodal_features
num_bytes: 17166278040
num_examples: 3096
download_size: 17190079193
dataset_size: 17166278040
configs:
- config_name: default
data_files:
- split: multimodal_features
path: data/multimodal_features-*
---
|
cmammides/BIOMON | cmammides | "2024-06-28T06:41:36Z" | 0 | 0 | [
"task_categories:feature-extraction",
"language:en",
"license:cc-by-4.0",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"doi:10.57967/hf/2613",
"region:us",
"biology"
] | [
"feature-extraction"
] | "2024-04-09T12:23:12Z" | ---
license: cc-by-4.0
task_categories:
- feature-extraction
language:
- en
tags:
- biology
pretty_name: BIOMON
size_categories:
- n>1T
---
"BIOMON: Using passive acoustic monitoring methods to survey bird communities in biodiverse agricultural farmlands in the EU"
https://cordis.europa.eu/project/id/101090273
1/6/2022 - 31/5/2024
BIOMON is funded by the European Union's Horizon Europe programme, ERA Talents, under grant agreement 101090273
A complete description of the dataset can be found in the following article:
1. Mammides, C., Ieronymidou, C. & Papadopoulos, H. (2024). An ecoacoustic dataset collected on the island of Cyprus in the Mediterranean Basin biodiversity hotspot. Preprint at https://doi.org/10.21203/rs.3.rs-4635704/v1
See also:
1. Mammides, C. et al. (2024). The Combined Effectiveness of Acoustic Indices in Measuring Bird Species Richness in Biodiverse Sites in Cyprus, China, and Australia. SSRN Scholarly Paper at https://doi.org/10.2139/ssrn.4823337.
2. Mammides, C., Huang, G., Sreekar, R., Ieronymidou, C. & Papadopoulos, H. (2024). A novel approach for calculating prediction uncertainty when using acoustic indices and machine learning algorithms to monitor animal communities. 30 May 2024, PREPRINT (Version 1) available at Research Square https://doi.org/10.21203/rs.3.rs-4494063/v1.
---
license: cc
tags:
- ecoacoustics
- biodiversity monitoring
- soundscape
---
Contact information:
Christos Mammides (cmammides@outlook.com)
https://cmammides.wordpress.com/ |
SUMM91/baramGB_AI | SUMM91 | "2024-04-24T07:21:48Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T12:37:09Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4970
num_examples: 44
download_size: 1806
dataset_size: 4970
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
ChrisWilson/twitter_dataset_1712666350 | ChrisWilson | "2024-04-09T12:39:37Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T12:39:10Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 7534
num_examples: 20
download_size: 8878
dataset_size: 7534
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
DynamicSuperb/InstrumentPitchClassification_Nsynth | DynamicSuperb | "2024-07-20T12:27:59Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:audio",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T12:50:25Z" | ---
dataset_info:
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: instruction
dtype: string
- name: file
dtype: string
- name: label
dtype: string
splits:
- name: test
num_bytes: 115764152.0
num_examples: 900
download_size: 88818787
dataset_size: 115764152.0
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
eightplay/test | eightplay | "2024-04-09T12:52:46Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-04-09T12:52:46Z" | ---
license: apache-2.0
---
|
Willl007/en_summarized | Willl007 | "2024-04-09T13:13:46Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T12:54:13Z" | ---
dataset_info:
features:
- name: Text
dtype: string
- name: Summarization
dtype: string
splits:
- name: train
num_bytes: 1760021.0157458815
num_examples: 7690
- name: validation
num_bytes: 498253.0235733139
num_examples: 2177
- name: test
num_bytes: 256335.9606808046
num_examples: 1120
download_size: 1883403
dataset_size: 2514610.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
Melricflash/CW_MedAbstracts | Melricflash | "2024-04-09T15:00:46Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:10K<n<100K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T13:17:35Z" | ---
license: apache-2.0
---
|
TrainingDataPro/caucasian-people-kyc-photo-dataset | TrainingDataPro | "2024-04-25T11:49:53Z" | 0 | 1 | [
"task_categories:image-classification",
"task_categories:image-to-image",
"language:en",
"license:cc-by-nc-nd-4.0",
"size_categories:n<1K",
"format:imagefolder",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us",
"code",
"legal",
"finance"
] | [
"image-classification",
"image-to-image"
] | "2024-04-09T13:32:10Z" | ---
license: cc-by-nc-nd-4.0
task_categories:
- image-classification
- image-to-image
language:
- en
tags:
- code
- legal
- finance
---
# Know Your Customer Dataset, Face Detection and Re-identification
# A similar dataset that includes all ethnicities: [Selfies and ID Dataset](https://trainingdata.pro/datasets/document-photos-and-selfies?utm_source=huggingface&utm_medium=cpc&utm_campaign=caucasian_kyc)
**80,000**+ photos including **10,600**+ document photos from **5,300** people from **28** countries.
The dataset includes 2 photos of a person from their documents and 13 selfies. All people presented in the dataset are Caucasian. The dataset contains a variety of images capturing individuals from diverse *backgrounds and age groups*.
**Photo documents contain only a photo of a person. All personal information from the document is hidden.**
![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2F9ad166a8728e7299087a69793e420918%2FFrame%2015%20(1).png?generation=1712143714014867&alt=media)
### Documents in the dataset
- Passports
- International passport
- Driver licenses
- Student cards
- Health certificate
- Pensioner's ID
- Pass to work
- Other documents
The dataset can be utilized for a wide range of tasks, including **face recognition, emotion detection, age estimation, gender classification**, or any problem related to human image analysis.
# 💴 For Commercial Usage: The full version of the dataset includes 80,000+ photos of people; leave a request on **[TrainingData](https://trainingdata.pro/datasets/caucasian-kyc-photo?utm_source=huggingface&utm_medium=cpc&utm_campaign=caucasian_kyc)** to buy the dataset
### Metadata for the full dataset:
- **assignment_id** - unique identifier of the media file
- **worker_id** - unique identifier of the person
- **age** - age of the person
- **gender** - gender of the person
- **country** - country of the person
- **ethnicity** - ethnicity of the person
- **photo_1_extension, photo_2_extension, …, photo_15_extension** - photo extensions in the dataset
- **photo_1_resolution, photo_2_resolution, …, photo_15_resolution** - photo resolution in the dataset
### Statistics for the dataset
![](https://www.googleapis.com/download/storage/v1/b/kaggle-user-content/o/inbox%2F12421376%2F5a5be7a59953aa5e05014dbc88c7740b%2FFrame%2093.png?generation=1712832246364646&alt=media)
# 💴 Buy the Dataset: This is just an example of the data. Leave a request on **[https://trainingdata.pro/datasets](https://trainingdata.pro/datasets/caucasian-kyc-photo?utm_source=huggingface&utm_medium=cpc&utm_campaign=caucasian_kyc) to learn about the price and buy the dataset**
# Content
The dataset consists of:
- **files** - 7 folders, one per person, each containing 15 images (2 id photos and 13 selfies),
- **.csv file** - contains information about the images and people in the dataset
### File with the extension .csv
- **id**: id of the person,
- **age** - age of the person,
- **gender** - gender of the person,
- **country** - country of the person,
- **id_1, id_2**: link to access id photos,
- **selfie_1, selfie_2, ..., selfie_13**: link to access each of the 13 selfies of the person
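As an illustration, a minimal sketch for reading the metadata file with pandas (the file name `dataset.csv` is an assumption; the actual file name in the delivery may differ):
```python
import pandas as pd

# Assumption: the metadata file is named "dataset.csv"; adjust to the actual file name.
df = pd.read_csv("dataset.csv")

# Collect the two id photo links and the 13 selfie link columns described above.
selfie_columns = [f"selfie_{i}" for i in range(1, 14)]
links = df[["id", "id_1", "id_2"] + selfie_columns]
print(links.head())
```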
## **[TrainingData](https://trainingdata.pro/datasets/caucasian-kyc-photo?utm_source=huggingface&utm_medium=cpc&utm_campaign=caucasian_kyc)** provides high-quality data annotation tailored to your needs
More datasets in TrainingData's Kaggle account: **https://www.kaggle.com/trainingdatapro/datasets**
TrainingData's GitHub: **https://github.com/Trainingdata-datamarket/TrainingData_All_datasets**
*keywords: biometric system, biometric dataset, face recognition database, face recognition dataset, face detection dataset, facial analysis, object detection dataset, deep learning datasets, computer vision datset, human images dataset, human faces dataset, machine learning, image-to-image, re-identification, id photos, selfies and paired id, photos, id verification models, passport, id card image, digital photo-identification, caucasian people, caucasian dataset* |
likhithasapu/ai-human-gen | likhithasapu | "2024-04-14T19:20:55Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T13:35:43Z" | ---
dataset_info:
features:
- name: id
dtype: int64
- name: context
dtype: string
- name: response
dtype: string
- name: human-generated
dtype: int64
splits:
- name: validation
num_bytes: 256797386
num_examples: 524342
- name: train
num_bytes: 230864915
num_examples: 500000
- name: test
num_bytes: 256797386
num_examples: 524342
download_size: 516175262
dataset_size: 744459687
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
AMead10/lvl_5_vital_wikipedia_articles_split | AMead10 | "2024-04-10T02:55:10Z" | 0 | 0 | [
"language:en",
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T13:40:57Z" | ---
language:
- en
dataset_info:
features:
- name: title
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 1183929798
num_examples: 2110473
download_size: 680523539
dataset_size: 1183929798
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
Text split version of [level 5 vital wikipedia articles](https://huggingface.co/datasets/AMead10/lvl_5_vital_wikipedia_articles). Text has been split on `\n\n`, and any instances where the length of the text was < 10 words were removed to avoid headings. |
OliverGN/job_listings | OliverGN | "2024-04-09T14:17:10Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:10K<n<100K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T14:16:38Z" | ---
license: apache-2.0
---
|
Minn0717/wsi | Minn0717 | "2024-04-09T14:58:51Z" | 0 | 0 | [
"license:unknown",
"size_categories:1K<n<10K",
"format:imagefolder",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T14:51:08Z" | ---
license: unknown
---
|
ricahrd/McKevinV2 | ricahrd | "2024-04-09T15:09:41Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T14:54:31Z" | ---
license: openrail
---
|
dearprakash/tamil_proverbs | dearprakash | "2024-04-12T10:15:19Z" | 0 | 0 | [
"language:ta",
"license:mit",
"size_categories:n<1K",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T15:16:22Z" | ---
language:
- ta
license: mit
---
This is a compilation of proverbs in Tamil extracted from public-domain content.
[The original source is available here](https://archive.org/details/dli.jZY9lup2kZl6TuXGlZQdjZldluhy.TVA_BOK_0006248/page/n1/mode/2up).
### Work in Progress
This repo will contain the proverbs as they are extracted, cleaned up, and manually verified for potential errors.
### Disclaimer:
Since a manual process is involved, please verify the content before using it.
|
Minn0717/my_wsi | Minn0717 | "2024-04-18T16:55:38Z" | 0 | 0 | [
"license:unknown",
"size_categories:n<1K",
"format:imagefolder",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T15:17:53Z" | ---
license: unknown
---
|
marcus2000/saiga__timelist_dataset | marcus2000 | "2024-04-09T15:18:49Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T15:18:44Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: label
dtype: string
splits:
- name: train
num_bytes: 9708659.888057634
num_examples: 3248
- name: test
num_bytes: 1079072.1119423662
num_examples: 361
download_size: 4335166
dataset_size: 10787732.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
mikhail-panzo/malay-processed | mikhail-panzo | "2024-05-04T13:09:44Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T15:57:08Z" | ---
dataset_info:
features:
- name: speaker_embeddings
sequence: float32
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
splits:
- name: train
num_bytes: 11280403594.14842
num_examples: 76392
- name: test
num_bytes: 1253525841.8515804
num_examples: 8489
download_size: 11731225697
dataset_size: 12533929436.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
tempertrash/QR_dataset | tempertrash | "2024-04-09T16:14:40Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:image",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T16:03:56Z" | ---
dataset_info:
features:
- name: QR
dtype: image
- name: round_QR
dtype: image
splits:
- name: train
num_bytes: 152542030.0
num_examples: 30000
download_size: 152851000
dataset_size: 152542030.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nitinbhayana/spell_error_data | nitinbhayana | "2024-04-09T16:13:39Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T16:13:33Z" | ---
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 80716
num_examples: 148
download_size: 44348
dataset_size: 80716
---
# Dataset Card for "spell_error_data"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
miriad/miriad-v0-6M | miriad | "2024-04-10T07:14:57Z" | 0 | 1 | [
"region:us"
] | null | "2024-04-09T16:43:21Z" | ---
dataset_info:
features:
- name: qa_id
dtype: string
- name: paper_id
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: paper_url
dtype: string
- name: paper_title
dtype: string
- name: passage_text
dtype: string
- name: passage_position
dtype: string
- name: year
dtype: int64
splits:
- name: train
num_bytes: 34415615449
num_examples: 6430601
download_size: 8205967742
dataset_size: 34415615449
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
theoracle/Italian.sentiment.analysis | theoracle | "2024-04-09T16:54:06Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"sentiment",
"italian",
"news headlines"
] | null | "2024-04-09T16:49:05Z" | ---
license: apache-2.0
tags:
- sentiment
- italian
- news headlines
size_categories:
- "n<1K"
---
# Dataset Description
**General Description:** This dataset consists of Italian news headlines with annotated sentiments. Each headline is enclosed in square brackets followed by the sentiment label 'positive', 'neutral', or 'negative'.
**Purpose:** The dataset is designed for training and evaluating sentiment analysis models on Italian-language text, particularly news headlines.
## Dataset Structure
**Size and Scope:** The dataset contains a small number of annotated headlines, suitable for initial model training or testing in sentiment analysis tasks.
**Data Fields:** Each record includes a 'headline' text field and a 'sentiment' label.
**Example:**
- headline: "[ mi fa sbagliare tutte le paroleeeee.]", sentiment: "negative"
- headline: "[ perfetto hai visto poi alla fine anche oggi e passato..]", sentiment: "neutral"
- headline: "[Rutelli: appoggio al governo #monti, sta lavorando bene #ballarò #osservatoriotivvù]", sentiment: "positive"
## Use Cases
**Sentiment Analysis Model Training:** Researchers and practitioners can use this dataset to develop and train sentiment analysis models for the Italian language.
**Academic Research:** The dataset can serve as a basis for studies in computational linguistics focusing on sentiment analysis in Italian news media.
|
ChrisWilson/twitter_dataset_1712682316 | ChrisWilson | "2024-04-09T17:06:13Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T17:05:16Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 24492
num_examples: 60
download_size: 17757
dataset_size: 24492
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
bunkalab/medium-sample-technology-tags | bunkalab | "2024-04-09T17:15:05Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T17:13:08Z" | ---
dataset_info:
features:
- name: title
dtype: string
- name: tags
dtype: string
- name: doc_id
dtype: int64
splits:
- name: train
num_bytes: 113529
num_examples: 1394
download_size: 68736
dataset_size: 113529
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mihaien/my-full-dataset-64 | mihaien | "2024-04-09T17:37:52Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:image",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T17:37:29Z" | ---
dataset_info:
features:
- name: image
dtype: image
splits:
- name: train
num_bytes: 12627584.536
num_examples: 6476
download_size: 10659195
dataset_size: 12627584.536
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jmcastelo17/FIFA_dataset | jmcastelo17 | "2024-04-09T17:50:11Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T17:45:22Z" | ---
dataset_info:
features:
- name: audio
dtype: binary
- name: text
dtype: string
splits:
- name: train
num_bytes: 328939441
num_examples: 296
download_size: 324971288
dataset_size: 328939441
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nblinh63/twitter_dataset_1712685077 | nblinh63 | "2024-04-09T17:52:04Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T17:51:17Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 206113
num_examples: 596
download_size: 88440
dataset_size: 206113
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
mihirch/SWE-bench__test-style-3__fs-test | mihirch | "2024-04-09T17:52:39Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T17:52:34Z" | ---
configs:
- config_name: default
data_files:
- split: dev
path: data/dev-*
- split: test
path: data/test-*
dataset_info:
features:
- name: instance_id
dtype: string
- name: text
dtype: string
- name: repo
dtype: string
- name: base_commit
dtype: string
- name: problem_statement
dtype: string
- name: hints_text
dtype: string
- name: created_at
dtype: string
- name: patch
dtype: string
- name: test_patch
dtype: string
- name: version
dtype: string
- name: FAIL_TO_PASS
dtype: string
- name: PASS_TO_PASS
dtype: string
- name: environment_setup_commit
dtype: string
splits:
- name: dev
num_bytes: 1001011
num_examples: 23
- name: test
num_bytes: 22730863
num_examples: 300
download_size: 9176758
dataset_size: 23731874
---
# Dataset Card for "SWE-bench__test-style-3__fs-test"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
nblinh63/twitter_dataset_1712686647 | nblinh63 | "2024-04-09T18:35:16Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T18:17:28Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 79216
num_examples: 202
download_size: 37108
dataset_size: 79216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nblinh63/twitter_dataset_1712686712 | nblinh63 | "2024-04-09T18:36:29Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T18:18:32Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 79216
num_examples: 202
download_size: 37108
dataset_size: 79216
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
kings-crown/IsarCodingLearn | kings-crown | "2024-04-23T19:09:44Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-04-09T18:18:52Z" | ---
license: mit
---
|
xz56/minipile-tok-gpt-2048 | xz56 | "2024-04-09T18:38:28Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T18:36:59Z" | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 5965671696
num_examples: 727876
- name: validation
num_bytes: 2770248
num_examples: 338
- name: test
num_bytes: 58625988
num_examples: 7153
download_size: 2464819303
dataset_size: 6027067932
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
nblinh63/twitter_dataset_1712687979 | nblinh63 | "2024-04-09T19:01:16Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T18:39:39Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 78974
num_examples: 200
download_size: 37277
dataset_size: 78974
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
BigTMiami/amazon_25M_10_000_condensed | BigTMiami | "2024-04-09T18:56:35Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T18:56:33Z" | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: attention_mask
sequence: int8
- name: labels
sequence: int64
splits:
- name: train
num_bytes: 11455624
num_examples: 1718
- name: validation
num_bytes: 5774488
num_examples: 866
download_size: 5547175
dataset_size: 17230112
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
nblinh63/twitter_dataset_1712689800 | nblinh63 | "2024-04-09T19:27:40Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T19:10:00Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 80461
num_examples: 201
download_size: 38420
dataset_size: 80461
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
welteny/CHACAL | welteny | "2024-04-09T19:24:57Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T19:17:17Z" | ---
license: openrail
---
|
communityai/akjindal53244___Arithmo-Data-50k | communityai | "2024-04-09T19:19:54Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T19:19:48Z" | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 112707051.53117467
num_examples: 50000
download_size: 45840156
dataset_size: 112707051.53117467
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Odeusys/llama-email-finetune-1k | Odeusys | "2024-04-09T20:57:54Z" | 0 | 0 | [
"license:mit",
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T19:19:59Z" | ---
license: mit
dataset_info:
features:
- name: Prompt
dtype: string
splits:
- name: train
num_bytes: 1321000
num_examples: 1000
download_size: 536754
dataset_size: 1321000
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
devrim/dmd_cifar10_edm_distillation_dataset | devrim | "2024-05-01T16:01:45Z" | 0 | 0 | [
"license:cc-by-nc-sa-4.0",
"region:us"
] | null | "2024-04-09T19:22:57Z" | ---
license: cc-by-nc-sa-4.0
---
|
nerfstudioteam/datasets | nerfstudioteam | "2024-04-09T19:59:47Z" | 0 | 0 | [
"task_categories:image-to-3d",
"size_categories:1K<n<10K",
"format:imagefolder",
"modality:3d",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us"
] | [
"image-to-3d"
] | "2024-04-09T19:24:24Z" | ---
task_categories:
- image-to-3d
--- |
ipipan/silesian-wikipedia-clean-20230901 | ipipan | "2024-05-24T15:37:08Z" | 0 | 0 | [
"license:cc-by-sa-4.0",
"size_categories:1K<n<10K",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-04-09T19:26:50Z" | ---
license: cc-by-sa-4.0
---
# Dataset Card for Clean Silesian Wikipedia
This is a cleaned and filtered snapshot of the 20230901 Silesian Wikipedia dump.
## License
CC BY-SA 4.0
## Citation
If you use this dataset, please cite the following paper:
```
@inproceedings{rybak-2024-transferring-bert,
title = "Transferring {BERT} Capabilities from High-Resource to Low-Resource Languages Using Vocabulary Matching",
author = "Rybak, Piotr",
editor = "Calzolari, Nicoletta and
Kan, Min-Yen and
Hoste, Veronique and
Lenci, Alessandro and
Sakti, Sakriani and
Xue, Nianwen",
booktitle = "Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)",
month = may,
year = "2024",
address = "Torino, Italia",
publisher = "ELRA and ICCL",
url = "https://aclanthology.org/2024.lrec-main.1456",
pages = "16745--16750",
abstract = "Pre-trained language models have revolutionized the natural language understanding landscape, most notably BERT (Bidirectional Encoder Representations from Transformers). However, a significant challenge remains for low-resource languages, where limited data hinders the effective training of such models. This work presents a novel approach to bridge this gap by transferring BERT capabilities from high-resource to low-resource languages using vocabulary matching. We conduct experiments on the Silesian and Kashubian languages and demonstrate the effectiveness of our approach to improve the performance of BERT models even when the target language has minimal training data. Our results highlight the potential of the proposed technique to effectively train BERT models for low-resource languages, thus democratizing access to advanced language understanding models.",
}
```
## Authors
The dataset was created by Piotr Rybak from [Linguistic Engineering Group at Institute of Computer Science, Polish Academy of Sciences](http://zil.ipipan.waw.pl/).
This work was supported by the European Regional Development Fund as a part of 2014–2020 Smart Growth Operational Programme, CLARIN — Common Language Resources and Technology Infrastructure, project no. POIR.04.02.00-00C002/19.
|
nblinh63/twitter_dataset_1712691123 | nblinh63 | "2024-04-09T19:49:52Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T19:32:03Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 79864
num_examples: 200
download_size: 38018
dataset_size: 79864
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
communityai/gretelai___synthetic_text_to_sql-20k | communityai | "2024-04-09T19:43:09Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T19:43:06Z" | ---
dataset_info:
features:
- name: source
dtype: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 16860322.2
num_examples: 20000
download_size: 6011892
dataset_size: 16860322.2
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
humane-intelligence/defcon34-ai-village-redteam | humane-intelligence | "2024-04-09T21:04:32Z" | 0 | 2 | [
"language:en",
"size_categories:10K<n<100K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T19:46:22Z" | ---
language:
- en
pretty_name: AI Village Defcon Dataset
---
This is the dataset from the [AI Village red teaming competition](https://aivillage.org/defcon31/), held at [DEF CON](https://defcon.org/) 31.
More details [here](https://aivillage.org/generative%20red%20team/generative-red-team/) |
nblinh63/twitter_dataset_1712692454 | nblinh63 | "2024-04-09T20:11:05Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-04-09T19:54:15Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: tweet_content
dtype: string
- name: user_name
dtype: string
- name: user_id
dtype: string
- name: created_at
dtype: string
- name: url
dtype: string
- name: favourite_count
dtype: int64
- name: scraped_at
dtype: string
- name: image_urls
dtype: string
splits:
- name: train
num_bytes: 79978
num_examples: 200
download_size: 37819
dataset_size: 79978
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|