datasetId: open-llm-leaderboard-old/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha
author: open-llm-leaderboard-old
last_modified: "2024-03-09T20:00:43Z"
downloads: 0
likes: 0
tags: ["region:us"]
task_categories: null
createdAt: "2024-03-09T20:00:22Z"
---
pretty_name: Evaluation run of jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha](https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T19:58:29.731575](https://huggingface.co/datasets/open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha/blob/main/results_2024-03-09T19-58-29.731575.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.26200232497358983,\n\
\ \"acc_stderr\": 0.030975360992871986,\n \"acc_norm\": 0.26329067860989236,\n\
\ \"acc_norm_stderr\": 0.03176130492930688,\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4051814653873621,\n\
\ \"mc2_stderr\": 0.014454247268086058\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.29948805460750855,\n \"acc_stderr\": 0.013385021637313563,\n\
\ \"acc_norm\": 0.32764505119453924,\n \"acc_norm_stderr\": 0.013715847940719346\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4182433778131846,\n\
\ \"acc_stderr\": 0.0049226246369452435,\n \"acc_norm\": 0.5377414857598088,\n\
\ \"acc_norm_stderr\": 0.004975546018950675\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909284,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909284\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.0391545063041425,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.0391545063041425\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.34868421052631576,\n \"acc_stderr\": 0.03878139888797609,\n\
\ \"acc_norm\": 0.34868421052631576,\n \"acc_norm_stderr\": 0.03878139888797609\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.21132075471698114,\n \"acc_stderr\": 0.025125766484827845,\n\
\ \"acc_norm\": 0.21132075471698114,\n \"acc_norm_stderr\": 0.025125766484827845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.19444444444444445,\n\
\ \"acc_stderr\": 0.03309615177059008,\n \"acc_norm\": 0.19444444444444445,\n\
\ \"acc_norm_stderr\": 0.03309615177059008\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.031862098516411426,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.031862098516411426\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.20851063829787234,\n \"acc_stderr\": 0.026556982117838746,\n\
\ \"acc_norm\": 0.20851063829787234,\n \"acc_norm_stderr\": 0.026556982117838746\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.23448275862068965,\n \"acc_stderr\": 0.035306258743465914,\n\
\ \"acc_norm\": 0.23448275862068965,\n \"acc_norm_stderr\": 0.035306258743465914\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1984126984126984,\n\
\ \"acc_stderr\": 0.03567016675276864,\n \"acc_norm\": 0.1984126984126984,\n\
\ \"acc_norm_stderr\": 0.03567016675276864\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.27741935483870966,\n\
\ \"acc_stderr\": 0.025470196835900055,\n \"acc_norm\": 0.27741935483870966,\n\
\ \"acc_norm_stderr\": 0.025470196835900055\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.03178529710642749,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.03178529710642749\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\"\
: 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365904,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365904\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.24871794871794872,\n \"acc_stderr\": 0.0219169577092138,\n \
\ \"acc_norm\": 0.24871794871794872,\n \"acc_norm_stderr\": 0.0219169577092138\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380572,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380572\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24954128440366974,\n \"acc_stderr\": 0.018553897629501624,\n \"\
acc_norm\": 0.24954128440366974,\n \"acc_norm_stderr\": 0.018553897629501624\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.31223628691983124,\n \"acc_stderr\": 0.030165137867847008,\n \
\ \"acc_norm\": 0.31223628691983124,\n \"acc_norm_stderr\": 0.030165137867847008\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.23318385650224216,\n\
\ \"acc_stderr\": 0.02838039114709472,\n \"acc_norm\": 0.23318385650224216,\n\
\ \"acc_norm_stderr\": 0.02838039114709472\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.19083969465648856,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.19083969465648856,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2975206611570248,\n \"acc_stderr\": 0.04173349148083499,\n \"\
acc_norm\": 0.2975206611570248,\n \"acc_norm_stderr\": 0.04173349148083499\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3067484662576687,\n \"acc_stderr\": 0.036230899157241474,\n\
\ \"acc_norm\": 0.3067484662576687,\n \"acc_norm_stderr\": 0.036230899157241474\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.1875,\n\
\ \"acc_stderr\": 0.0370468111477387,\n \"acc_norm\": 0.1875,\n \
\ \"acc_norm_stderr\": 0.0370468111477387\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.2621359223300971,\n \"acc_stderr\": 0.04354631077260595,\n\
\ \"acc_norm\": 0.2621359223300971,\n \"acc_norm_stderr\": 0.04354631077260595\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.02812096650391439,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.02812096650391439\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26436781609195403,\n\
\ \"acc_stderr\": 0.01576998484069052,\n \"acc_norm\": 0.26436781609195403,\n\
\ \"acc_norm_stderr\": 0.01576998484069052\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553995,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553995\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113592,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2990353697749196,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.2990353697749196,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.24691358024691357,\n \"acc_stderr\": 0.023993501709042096,\n\
\ \"acc_norm\": 0.24691358024691357,\n \"acc_norm_stderr\": 0.023993501709042096\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.25177304964539005,\n \"acc_stderr\": 0.0258921511567094,\n \
\ \"acc_norm\": 0.25177304964539005,\n \"acc_norm_stderr\": 0.0258921511567094\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23859191655801826,\n\
\ \"acc_stderr\": 0.010885929742002223,\n \"acc_norm\": 0.23859191655801826,\n\
\ \"acc_norm_stderr\": 0.010885929742002223\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.3014705882352941,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.3014705882352941,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26143790849673204,\n \"acc_stderr\": 0.017776947157528044,\n \
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.017776947157528044\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.20909090909090908,\n\
\ \"acc_stderr\": 0.03895091015724137,\n \"acc_norm\": 0.20909090909090908,\n\
\ \"acc_norm_stderr\": 0.03895091015724137\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24081632653061225,\n \"acc_stderr\": 0.027372942201788163,\n\
\ \"acc_norm\": 0.24081632653061225,\n \"acc_norm_stderr\": 0.027372942201788163\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.30845771144278605,\n\
\ \"acc_stderr\": 0.032658195885126994,\n \"acc_norm\": 0.30845771144278605,\n\
\ \"acc_norm_stderr\": 0.032658195885126994\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.19879518072289157,\n\
\ \"acc_stderr\": 0.031069390260789437,\n \"acc_norm\": 0.19879518072289157,\n\
\ \"acc_norm_stderr\": 0.031069390260789437\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.21052631578947367,\n \"acc_stderr\": 0.0312678171466318,\n\
\ \"acc_norm\": 0.21052631578947367,\n \"acc_norm_stderr\": 0.0312678171466318\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2386780905752754,\n\
\ \"mc1_stderr\": 0.014922629695456418,\n \"mc2\": 0.4051814653873621,\n\
\ \"mc2_stderr\": 0.014454247268086058\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5895816890292028,\n \"acc_stderr\": 0.013825107120035861\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.006065200909780136,\n \
\ \"acc_stderr\": 0.0021386703014604483\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-58-29.731575.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T19-58-29.731575.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- '**/details_harness|winogrande|5_2024-03-09T19-58-29.731575.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T19-58-29.731575.parquet'
- config_name: results
data_files:
- split: 2024_03_09T19_58_29.731575
path:
- results_2024-03-09T19-58-29.731575.parquet
- split: latest
path:
- results_2024-03-09T19-58-29.731575.parquet
---
# Dataset Card for Evaluation run of jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha](https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha",
"harness_winogrande_5",
split="train")
```
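As the configuration listing above suggests, each timestamped split name is derived from the run timestamp by replacing `-` and `:` with `_` (e.g. run `2024-03-09T19:58:29.731575` becomes split `2024_03_09T19_58_29.731575`). A minimal sketch of that mapping — the helper name is hypothetical, not part of any library API:

```python
def split_name_from_timestamp(ts: str) -> str:
    """Map a run timestamp to its split name, per the config listing above.

    e.g. "2024-03-09T19:58:29.731575" -> "2024_03_09T19_58_29.731575"
    """
    return ts.replace("-", "_").replace(":", "_")

print(split_name_from_timestamp("2024-03-09T19:58:29.731575"))
```

This lets you target a specific historical run instead of the `latest` split when several evaluations have been appended to the same configuration.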
## Latest results
These are the [latest results from run 2024-03-09T19:58:29.731575](https://huggingface.co/datasets/open-llm-leaderboard/details_jeff31415__TinyLlama-1.1B-1.5T-OpenOrca-Alpha/blob/main/results_2024-03-09T19-58-29.731575.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.26200232497358983,
"acc_stderr": 0.030975360992871986,
"acc_norm": 0.26329067860989236,
"acc_norm_stderr": 0.03176130492930688,
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4051814653873621,
"mc2_stderr": 0.014454247268086058
},
"harness|arc:challenge|25": {
"acc": 0.29948805460750855,
"acc_stderr": 0.013385021637313563,
"acc_norm": 0.32764505119453924,
"acc_norm_stderr": 0.013715847940719346
},
"harness|hellaswag|10": {
"acc": 0.4182433778131846,
"acc_stderr": 0.0049226246369452435,
"acc_norm": 0.5377414857598088,
"acc_norm_stderr": 0.004975546018950675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.0391545063041425,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.0391545063041425
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.34868421052631576,
"acc_stderr": 0.03878139888797609,
"acc_norm": 0.34868421052631576,
"acc_norm_stderr": 0.03878139888797609
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.21132075471698114,
"acc_stderr": 0.025125766484827845,
"acc_norm": 0.21132075471698114,
"acc_norm_stderr": 0.025125766484827845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.19444444444444445,
"acc_stderr": 0.03309615177059008,
"acc_norm": 0.19444444444444445,
"acc_norm_stderr": 0.03309615177059008
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.031862098516411426,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.031862098516411426
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.20851063829787234,
"acc_stderr": 0.026556982117838746,
"acc_norm": 0.20851063829787234,
"acc_norm_stderr": 0.026556982117838746
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.23448275862068965,
"acc_stderr": 0.035306258743465914,
"acc_norm": 0.23448275862068965,
"acc_norm_stderr": 0.035306258743465914
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1984126984126984,
"acc_stderr": 0.03567016675276864,
"acc_norm": 0.1984126984126984,
"acc_norm_stderr": 0.03567016675276864
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.27741935483870966,
"acc_stderr": 0.025470196835900055,
"acc_norm": 0.27741935483870966,
"acc_norm_stderr": 0.025470196835900055
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.03178529710642749,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.03178529710642749
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365904,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365904
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.24871794871794872,
"acc_stderr": 0.0219169577092138,
"acc_norm": 0.24871794871794872,
"acc_norm_stderr": 0.0219169577092138
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380572,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380572
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24954128440366974,
"acc_stderr": 0.018553897629501624,
"acc_norm": 0.24954128440366974,
"acc_norm_stderr": 0.018553897629501624
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.31223628691983124,
"acc_stderr": 0.030165137867847008,
"acc_norm": 0.31223628691983124,
"acc_norm_stderr": 0.030165137867847008
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.23318385650224216,
"acc_stderr": 0.02838039114709472,
"acc_norm": 0.23318385650224216,
"acc_norm_stderr": 0.02838039114709472
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.19083969465648856,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.19083969465648856,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2975206611570248,
"acc_stderr": 0.04173349148083499,
"acc_norm": 0.2975206611570248,
"acc_norm_stderr": 0.04173349148083499
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3067484662576687,
"acc_stderr": 0.036230899157241474,
"acc_norm": 0.3067484662576687,
"acc_norm_stderr": 0.036230899157241474
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.1875,
"acc_stderr": 0.0370468111477387,
"acc_norm": 0.1875,
"acc_norm_stderr": 0.0370468111477387
},
"harness|hendrycksTest-management|5": {
"acc": 0.2621359223300971,
"acc_stderr": 0.04354631077260595,
"acc_norm": 0.2621359223300971,
"acc_norm_stderr": 0.04354631077260595
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.02812096650391439,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.02812096650391439
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26436781609195403,
"acc_stderr": 0.01576998484069052,
"acc_norm": 0.26436781609195403,
"acc_norm_stderr": 0.01576998484069052
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553995,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553995
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2990353697749196,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.2990353697749196,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.24691358024691357,
"acc_stderr": 0.023993501709042096,
"acc_norm": 0.24691358024691357,
"acc_norm_stderr": 0.023993501709042096
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.25177304964539005,
"acc_stderr": 0.0258921511567094,
"acc_norm": 0.25177304964539005,
"acc_norm_stderr": 0.0258921511567094
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23859191655801826,
"acc_stderr": 0.010885929742002223,
"acc_norm": 0.23859191655801826,
"acc_norm_stderr": 0.010885929742002223
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.3014705882352941,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.3014705882352941,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.017776947157528044,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.017776947157528044
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.20909090909090908,
"acc_stderr": 0.03895091015724137,
"acc_norm": 0.20909090909090908,
"acc_norm_stderr": 0.03895091015724137
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24081632653061225,
"acc_stderr": 0.027372942201788163,
"acc_norm": 0.24081632653061225,
"acc_norm_stderr": 0.027372942201788163
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.30845771144278605,
"acc_stderr": 0.032658195885126994,
"acc_norm": 0.30845771144278605,
"acc_norm_stderr": 0.032658195885126994
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.19879518072289157,
"acc_stderr": 0.031069390260789437,
"acc_norm": 0.19879518072289157,
"acc_norm_stderr": 0.031069390260789437
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0312678171466318,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0312678171466318
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2386780905752754,
"mc1_stderr": 0.014922629695456418,
"mc2": 0.4051814653873621,
"mc2_stderr": 0.014454247268086058
},
"harness|winogrande|5": {
"acc": 0.5895816890292028,
"acc_stderr": 0.013825107120035861
},
"harness|gsm8k|5": {
"acc": 0.006065200909780136,
"acc_stderr": 0.0021386703014604483
}
}
```
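The per-task results above are keyed as `harness|<task>|<n_shots>`, with the MMLU subjects prefixed `hendrycksTest`. A minimal sketch of how one might re-aggregate accuracies from this JSON, shown on a small sample of the structure (values truncated from the full results above):

```python
import json

# A small sample of the per-task results structure shown above.
results_json = """
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.24, "acc_norm": 0.24},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.28888888888888886, "acc_norm": 0.28888888888888886},
  "harness|winogrande|5": {"acc": 0.5895816890292028}
}
"""
results = json.loads(results_json)

# Keep only the MMLU (hendrycksTest) tasks and average their acc_norm.
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
avg_acc_norm = sum(v["acc_norm"] for v in mmlu.values()) / len(mmlu)
print(round(avg_acc_norm, 4))
```

The same pattern applies to the full results file, e.g. to recompute the `"all"` block or to compare `acc` against `acc_norm` per subject.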
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
open-llm-leaderboard-old/details_MaziyarPanahi__Calme-7B-Instruct-v0.1 | open-llm-leaderboard-old | "2024-03-09T20:16:54Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T20:16:32Z" | ---
pretty_name: Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Calme-7B-Instruct-v0.1](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:14:19.048568](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.1/blob/main/results_2024-03-09T20-14-19.048568.json)\
\ (note that there might be results for other tasks in the repo if successive evals\
\ didn't cover the same tasks; you can find each eval's results in its timestamped\
\ split and in the \"latest\" split):\n\n```python\n{\n    \"all\": {\n        \"acc\": 0.6533857822231156,\n\
\ \"acc_stderr\": 0.031950950731080195,\n \"acc_norm\": 0.6535689315699622,\n\
\ \"acc_norm_stderr\": 0.03261503340648672,\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.59377238996997,\n\
\ \"mc2_stderr\": 0.015205340401893556\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175458,\n\
\ \"acc_norm\": 0.6723549488054608,\n \"acc_norm_stderr\": 0.013715847940719337\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6585341565425215,\n\
\ \"acc_stderr\": 0.004732322172153749,\n \"acc_norm\": 0.8557060346544513,\n\
\ \"acc_norm_stderr\": 0.0035066942243475725\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7302631578947368,\n \"acc_stderr\": 0.03611780560284898,\n\
\ \"acc_norm\": 0.7302631578947368,\n \"acc_norm_stderr\": 0.03611780560284898\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741702,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741702\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620332,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620332\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778408,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778408\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04472135954999579,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04472135954999579\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554952,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554952\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945633,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945633\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6692307692307692,\n \"acc_stderr\": 0.02385479568097112,\n \
\ \"acc_norm\": 0.6692307692307692,\n \"acc_norm_stderr\": 0.02385479568097112\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634332,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634332\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.015094215699700481,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.015094215699700481\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.021901905115073332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.021901905115073332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8275862068965517,\n\
\ \"acc_stderr\": 0.013507943909371803,\n \"acc_norm\": 0.8275862068965517,\n\
\ \"acc_norm_stderr\": 0.013507943909371803\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3675977653631285,\n\
\ \"acc_stderr\": 0.016125543823552947,\n \"acc_norm\": 0.3675977653631285,\n\
\ \"acc_norm_stderr\": 0.016125543823552947\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.02977945095730307,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.02977945095730307\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4595827900912647,\n\
\ \"acc_stderr\": 0.012728446067669968,\n \"acc_norm\": 0.4595827900912647,\n\
\ \"acc_norm_stderr\": 0.012728446067669968\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128448,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128448\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8507462686567164,\n\
\ \"acc_stderr\": 0.02519692987482707,\n \"acc_norm\": 0.8507462686567164,\n\
\ \"acc_norm_stderr\": 0.02519692987482707\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4320685434516524,\n\
\ \"mc1_stderr\": 0.017341202394988257,\n \"mc2\": 0.59377238996997,\n\
\ \"mc2_stderr\": 0.015205340401893556\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8334648776637726,\n \"acc_stderr\": 0.010470796496781093\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \
\ \"acc_stderr\": 0.0127056857231317\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-14-19.048568.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-14-19.048568.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- '**/details_harness|winogrande|5_2024-03-09T20-14-19.048568.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-14-19.048568.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_14_19.048568
path:
- results_2024-03-09T20-14-19.048568.parquet
- split: latest
path:
- results_2024-03-09T20-14-19.048568.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Calme-7B-Instruct-v0.1](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
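Judging from the configuration list above, the timestamped split names appear to be derived from the run timestamp by replacing `-` and `:` with `_` (the Parquet file names instead replace `:` with `-`). This mapping is an observation from the config list, not documented behavior; a small helper sketching it:

```python
def timestamp_to_split_name(run_timestamp: str) -> str:
    """Convert a run timestamp (as it appears in results file names) into the
    split name used in this dataset's configurations.

    Assumption: split names are the timestamp with '-' and ':' replaced by '_',
    as suggested by the config list above (e.g. '2024-03-09T20:14:19.048568'
    -> '2024_03_09T20_14_19.048568').
    """
    return run_timestamp.replace("-", "_").replace(":", "_")


print(timestamp_to_split_name("2024-03-09T20:14:19.048568"))
# 2024_03_09T20_14_19.048568
```

Either this timestamped split name or `latest` can be passed as the `split` argument when loading a configuration.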
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.1",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-09T20:14:19.048568](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.1/blob/main/results_2024-03-09T20-14-19.048568.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6533857822231156,
"acc_stderr": 0.031950950731080195,
"acc_norm": 0.6535689315699622,
"acc_norm_stderr": 0.03261503340648672,
"mc1": 0.4320685434516524,
"mc1_stderr": 0.017341202394988257,
"mc2": 0.59377238996997,
"mc2_stderr": 0.015205340401893556
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175458,
"acc_norm": 0.6723549488054608,
"acc_norm_stderr": 0.013715847940719337
},
"harness|hellaswag|10": {
"acc": 0.6585341565425215,
"acc_stderr": 0.004732322172153749,
"acc_norm": 0.8557060346544513,
"acc_norm_stderr": 0.0035066942243475725
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7302631578947368,
"acc_stderr": 0.03611780560284898,
"acc_norm": 0.7302631578947368,
"acc_norm_stderr": 0.03611780560284898
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741702,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741702
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778408,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778408
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5,
"acc_stderr": 0.04472135954999579,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04472135954999579
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554952,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554952
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945633,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945633
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6692307692307692,
"acc_stderr": 0.02385479568097112,
"acc_norm": 0.6692307692307692,
"acc_norm_stderr": 0.02385479568097112
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634332,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634332
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.015094215699700481,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.015094215699700481
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.021901905115073332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.021901905115073332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8275862068965517,
"acc_stderr": 0.013507943909371803,
"acc_norm": 0.8275862068965517,
"acc_norm_stderr": 0.013507943909371803
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3675977653631285,
"acc_stderr": 0.016125543823552947,
"acc_norm": 0.3675977653631285,
"acc_norm_stderr": 0.016125543823552947
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.02977945095730307,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.02977945095730307
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4595827900912647,
"acc_stderr": 0.012728446067669968,
"acc_norm": 0.4595827900912647,
"acc_norm_stderr": 0.012728446067669968
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128448,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128448
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8507462686567164,
"acc_stderr": 0.02519692987482707,
"acc_norm": 0.8507462686567164,
"acc_norm_stderr": 0.02519692987482707
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4320685434516524,
"mc1_stderr": 0.017341202394988257,
"mc2": 0.59377238996997,
"mc2_stderr": 0.015205340401893556
},
"harness|winogrande|5": {
"acc": 0.8334648776637726,
"acc_stderr": 0.010470796496781093
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.0127056857231317
}
}
```
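As an illustration (not part of the official leaderboard tooling), the per-task MMLU accuracies in a results payload like the one above can be aggregated with plain Python. The snippet below uses a small hand-copied subset of the dictionary; `results` here is an illustrative stand-in, not the full payload:

```python
# Hand-copied subset of the results dictionary shown above (illustrative only).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.38},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6518518518518519},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7302631578947368},
    "harness|truthfulqa:mc|0": {"mc1": 0.4320685434516524},  # not an MMLU task
}

# Average 'acc' over the MMLU (hendrycksTest) tasks only.
mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # 0.5874
```

Running the same filter over the full dictionary reproduces the per-category aggregation that the leaderboard reports.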
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of jefferylovely/MoeLovely-13B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jefferylovely/MoeLovely-13B](https://huggingface.co/jefferylovely/MoeLovely-13B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jefferylovely__MoeLovely-13B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:15:16.888132](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__MoeLovely-13B/blob/main/results_2024-03-09T20-15-16.888132.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6546902728965655,\n\
\ \"acc_stderr\": 0.03211718616098461,\n \"acc_norm\": 0.6535419961890351,\n\
\ \"acc_norm_stderr\": 0.03280312459827673,\n \"mc1\": 0.6376988984088128,\n\
\ \"mc1_stderr\": 0.01682664689726226,\n \"mc2\": 0.7873533609473717,\n\
\ \"mc2_stderr\": 0.01360915944260026\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7073378839590444,\n \"acc_stderr\": 0.013295916103619423,\n\
\ \"acc_norm\": 0.7372013651877133,\n \"acc_norm_stderr\": 0.012862523175351333\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7344154550886277,\n\
\ \"acc_stderr\": 0.004407413723383404,\n \"acc_norm\": 0.8949412467635929,\n\
\ \"acc_norm_stderr\": 0.003060024474796982\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.041539484047423976,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.041539484047423976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.036146654241808254,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.036146654241808254\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768077,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768077\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.047036043419179864,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.047036043419179864\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4365079365079365,\n \"acc_stderr\": 0.025542846817400496,\n \"\
acc_norm\": 0.4365079365079365,\n \"acc_norm_stderr\": 0.025542846817400496\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7806451612903226,\n\
\ \"acc_stderr\": 0.023540799358723295,\n \"acc_norm\": 0.7806451612903226,\n\
\ \"acc_norm_stderr\": 0.023540799358723295\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.02380763319865727,\n \
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.02380763319865727\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251972,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251972\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.030283995525884396,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5277777777777778,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.5277777777777778,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n\
\ \"acc_stderr\": 0.026756401538078962,\n \"acc_norm\": 0.8235294117647058,\n\
\ \"acc_norm_stderr\": 0.026756401538078962\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8016877637130801,\n \"acc_stderr\": 0.02595502084162113,\n\
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.02595502084162113\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4017857142857143,\n\
\ \"acc_stderr\": 0.04653333146973646,\n \"acc_norm\": 0.4017857142857143,\n\
\ \"acc_norm_stderr\": 0.04653333146973646\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406974,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406974\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608306,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608306\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.46145251396648046,\n\
\ \"acc_stderr\": 0.01667273126755226,\n \"acc_norm\": 0.46145251396648046,\n\
\ \"acc_norm_stderr\": 0.01667273126755226\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7189542483660131,\n \"acc_stderr\": 0.025738854797818733,\n\
\ \"acc_norm\": 0.7189542483660131,\n \"acc_norm_stderr\": 0.025738854797818733\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.024836057868294677,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.024836057868294677\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5177304964539007,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.5177304964539007,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47196870925684486,\n\
\ \"acc_stderr\": 0.012750151802922436,\n \"acc_norm\": 0.47196870925684486,\n\
\ \"acc_norm_stderr\": 0.012750151802922436\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396556,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396556\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685515,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685515\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6376988984088128,\n\
\ \"mc1_stderr\": 0.01682664689726226,\n \"mc2\": 0.7873533609473717,\n\
\ \"mc2_stderr\": 0.01360915944260026\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8760852407261247,\n \"acc_stderr\": 0.009260146295063712\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6914329037149356,\n \
\ \"acc_stderr\": 0.012723076049815896\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-15-16.888132.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-15-16.888132.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- '**/details_harness|winogrande|5_2024-03-09T20-15-16.888132.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-15-16.888132.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_15_16.888132
path:
- results_2024-03-09T20-15-16.888132.parquet
- split: latest
path:
- results_2024-03-09T20-15-16.888132.parquet
---
# Dataset Card for Evaluation run of jefferylovely/MoeLovely-13B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jefferylovely/MoeLovely-13B](https://huggingface.co/jefferylovely/MoeLovely-13B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jefferylovely__MoeLovely-13B",
"harness_winogrande_5",
split="train")
```
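The timestamped split names used throughout this card follow a simple convention. As a sketch (inferred from the split names listed in this card's configs, not a documented API), a run timestamp maps to its split name like this:

```python
# Derive a split name from a run timestamp. This mirrors the naming
# observed in this card's configs (an inferred convention, not a
# documented API): '-' and ':' become '_', everything else is unchanged.
run_timestamp = "2024-03-09T20:15:16.888132"
split_name = run_timestamp.replace("-", "_").replace(":", "_")
print(split_name)  # 2024_03_09T20_15_16.888132
```

The resulting string can be passed as the `split` argument in place of `"train"` or `"latest"` to pin a specific run.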
## Latest results
These are the [latest results from run 2024-03-09T20:15:16.888132](https://huggingface.co/datasets/open-llm-leaderboard/details_jefferylovely__MoeLovely-13B/blob/main/results_2024-03-09T20-15-16.888132.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6546902728965655,
"acc_stderr": 0.03211718616098461,
"acc_norm": 0.6535419961890351,
"acc_norm_stderr": 0.03280312459827673,
"mc1": 0.6376988984088128,
"mc1_stderr": 0.01682664689726226,
"mc2": 0.7873533609473717,
"mc2_stderr": 0.01360915944260026
},
"harness|arc:challenge|25": {
"acc": 0.7073378839590444,
"acc_stderr": 0.013295916103619423,
"acc_norm": 0.7372013651877133,
"acc_norm_stderr": 0.012862523175351333
},
"harness|hellaswag|10": {
"acc": 0.7344154550886277,
"acc_stderr": 0.004407413723383404,
"acc_norm": 0.8949412467635929,
"acc_norm_stderr": 0.003060024474796982
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.041539484047423976,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.041539484047423976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.036146654241808254,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.036146654241808254
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.025542846817400496,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.025542846817400496
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.02380763319865727,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.02380763319865727
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251972,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251972
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078962,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078962
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.02595502084162113,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.02595502084162113
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4017857142857143,
"acc_stderr": 0.04653333146973646,
"acc_norm": 0.4017857142857143,
"acc_norm_stderr": 0.04653333146973646
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406974,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406974
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608306,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608306
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.46145251396648046,
"acc_stderr": 0.01667273126755226,
"acc_norm": 0.46145251396648046,
"acc_norm_stderr": 0.01667273126755226
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7189542483660131,
"acc_stderr": 0.025738854797818733,
"acc_norm": 0.7189542483660131,
"acc_norm_stderr": 0.025738854797818733
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.024836057868294677,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.024836057868294677
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5177304964539007,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.5177304964539007,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47196870925684486,
"acc_stderr": 0.012750151802922436,
"acc_norm": 0.47196870925684486,
"acc_norm_stderr": 0.012750151802922436
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396556,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396556
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685515,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685515
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6376988984088128,
"mc1_stderr": 0.01682664689726226,
"mc2": 0.7873533609473717,
"mc2_stderr": 0.01360915944260026
},
"harness|winogrande|5": {
"acc": 0.8760852407261247,
"acc_stderr": 0.009260146295063712
},
"harness|gsm8k|5": {
"acc": 0.6914329037149356,
"acc_stderr": 0.012723076049815896
}
}
```
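The per-task scores above can be post-processed directly once the JSON is loaded. As a minimal sketch (using a small hand-copied subset of the scores above rather than the full dict), the MMLU (`hendrycksTest`) tasks can be filtered out and macro-averaged like this:

```python
# Macro-average acc_norm over the hendrycksTest (MMLU) tasks only.
# The dict below is a small hand-copied excerpt of the results shown above.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.7372013651877133},
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.32},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.6907894736842105},
}

# Keep only the MMLU tasks, identified by their "hendrycksTest-" prefix.
mmlu = {name: scores for name, scores in results.items()
        if name.startswith("harness|hendrycksTest-")}
macro_avg = sum(s["acc_norm"] for s in mmlu.values()) / len(mmlu)
print(f"MMLU macro-average acc_norm over {len(mmlu)} tasks: {macro_avg:.4f}")
```

The same filtering works on the full dict parsed from the linked results JSON.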
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_MaziyarPanahi__Calme-7B-Instruct-v0.1.1 | open-llm-leaderboard-old | "2024-03-09T20:25:53Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T20:25:32Z" | ---
pretty_name: Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.1.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Calme-7B-Instruct-v0.1.1](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.1.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.1.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:23:11.432863](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.1.1/blob/main/results_2024-03-09T20-23-11.432863.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6498252661813451,\n\
\ \"acc_stderr\": 0.032055331163985706,\n \"acc_norm\": 0.6489309523100264,\n\
\ \"acc_norm_stderr\": 0.03272915880969551,\n \"mc1\": 0.6340269277845777,\n\
\ \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7809684951078746,\n\
\ \"mc2_stderr\": 0.01367755228171902\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.712457337883959,\n \"acc_stderr\": 0.013226719056266127,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7189802828121888,\n\
\ \"acc_stderr\": 0.004485784468576664,\n \"acc_norm\": 0.8925512846046604,\n\
\ \"acc_norm_stderr\": 0.003090499801090435\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6564102564102564,\n \"acc_stderr\": 0.024078696580635477,\n\
\ \"acc_norm\": 0.6564102564102564,\n \"acc_norm_stderr\": 0.024078696580635477\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.028317533496066485,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.028317533496066485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8440366972477065,\n \"acc_stderr\": 0.01555580271359017,\n \"\
acc_norm\": 0.8440366972477065,\n \"acc_norm_stderr\": 0.01555580271359017\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4212290502793296,\n\
\ \"acc_stderr\": 0.016513676031179595,\n \"acc_norm\": 0.4212290502793296,\n\
\ \"acc_norm_stderr\": 0.016513676031179595\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.024383665531035454,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.024383665531035454\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523365,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523365\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.025870646766169136,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.025870646766169136\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6340269277845777,\n\
\ \"mc1_stderr\": 0.016862941684088386,\n \"mc2\": 0.7809684951078746,\n\
\ \"mc2_stderr\": 0.01367755228171902\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8516179952644041,\n \"acc_stderr\": 0.00999070600518414\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6914329037149356,\n \
\ \"acc_stderr\": 0.01272307604981591\n }\n}\n```"
repo_url: https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.1.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-23-11.432863.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-23-11.432863.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- '**/details_harness|winogrande|5_2024-03-09T20-23-11.432863.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-23-11.432863.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_23_11.432863
path:
- results_2024-03-09T20-23-11.432863.parquet
- split: latest
path:
- results_2024-03-09T20-23-11.432863.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Calme-7B-Instruct-v0.1.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Calme-7B-Instruct-v0.1.1](https://huggingface.co/MaziyarPanahi/Calme-7B-Instruct-v0.1.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.1.1",
"harness_winogrande_5",
split="train")
```
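Each timestamped split name is simply the run timestamp with `:` and `-` replaced by `_` (for example `2024_03_09T20_23_11.432863`). The helper below is an illustrative sketch for recovering a `datetime` from such a split name; it is not part of the `datasets` API:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Recover the run timestamp from a split name like
    '2024_03_09T20_23_11.432863', where ':' and '-' were replaced by '_'."""
    date_part, time_part = split_name.split("T")
    return datetime.strptime(
        f"{date_part.replace('_', '-')}T{time_part.replace('_', ':')}",
        "%Y-%m-%dT%H:%M:%S.%f",
    )

print(parse_split_timestamp("2024_03_09T20_23_11.432863"))
# 2024-03-09 20:23:11.432863
```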
## Latest results
These are the [latest results from run 2024-03-09T20:23:11.432863](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Calme-7B-Instruct-v0.1.1/blob/main/results_2024-03-09T20-23-11.432863.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6498252661813451,
"acc_stderr": 0.032055331163985706,
"acc_norm": 0.6489309523100264,
"acc_norm_stderr": 0.03272915880969551,
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7809684951078746,
"mc2_stderr": 0.01367755228171902
},
"harness|arc:challenge|25": {
"acc": 0.712457337883959,
"acc_stderr": 0.013226719056266127,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7189802828121888,
"acc_stderr": 0.004485784468576664,
"acc_norm": 0.8925512846046604,
"acc_norm_stderr": 0.003090499801090435
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6564102564102564,
"acc_stderr": 0.024078696580635477,
"acc_norm": 0.6564102564102564,
"acc_norm_stderr": 0.024078696580635477
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.028317533496066485,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.028317533496066485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8440366972477065,
"acc_stderr": 0.01555580271359017,
"acc_norm": 0.8440366972477065,
"acc_norm_stderr": 0.01555580271359017
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4212290502793296,
"acc_stderr": 0.016513676031179595,
"acc_norm": 0.4212290502793296,
"acc_norm_stderr": 0.016513676031179595
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.024383665531035454,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.024383665531035454
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523365,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523365
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.025870646766169136,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.025870646766169136
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6340269277845777,
"mc1_stderr": 0.016862941684088386,
"mc2": 0.7809684951078746,
"mc2_stderr": 0.01367755228171902
},
"harness|winogrande|5": {
"acc": 0.8516179952644041,
"acc_stderr": 0.00999070600518414
},
"harness|gsm8k|5": {
"acc": 0.6914329037149356,
"acc_stderr": 0.01272307604981591
}
}
```
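A rough sketch of how these numbers roll up, assuming the leaderboard-v1 convention of averaging six benchmark scores (ARC acc_norm, HellaSwag acc_norm, MMLU acc, TruthfulQA mc2, Winogrande acc, GSM8K acc). The values are copied from the JSON above; note that the MMLU entry here uses the aggregate "all" accuracy, which in these files also folds in the other tasks, so treat the recomputed average as approximate:

```python
# Scores copied from the JSON results above (hypothetical roll-up sketch).
scores = {
    "arc_challenge (25-shot, acc_norm)": 0.7295221843003413,
    "hellaswag (10-shot, acc_norm)":     0.8925512846046604,
    "mmlu (5-shot, acc, approx.)":       0.6498252661813451,
    "truthfulqa (0-shot, mc2)":          0.7809684951078746,
    "winogrande (5-shot, acc)":          0.8516179952644041,
    "gsm8k (5-shot, acc)":               0.6914329037149356,
}
# The leaderboard "Average" is the unweighted mean of the six scores.
average = sum(scores.values()) / len(scores)
print(f"Approximate leaderboard average: {average:.4f}")
# Approximate leaderboard average: 0.7660
```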
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of MaziyarPanahi/Mistral-7B-Instruct-Aya-101
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Mistral-7B-Instruct-Aya-101](https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-Aya-101)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:26:40.729869](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101/blob/main/results_2024-03-09T20-26-40.729869.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6172980252936014,\n\
\ \"acc_stderr\": 0.032867077607959115,\n \"acc_norm\": 0.6226740310135067,\n\
\ \"acc_norm_stderr\": 0.03353287092767944,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5270943122503622,\n\
\ \"mc2_stderr\": 0.015098813242240394\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5477815699658704,\n \"acc_stderr\": 0.014544519880633823,\n\
\ \"acc_norm\": 0.591296928327645,\n \"acc_norm_stderr\": 0.014365750345426998\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6266679944234216,\n\
\ \"acc_stderr\": 0.004827006520802887,\n \"acc_norm\": 0.8320055765783708,\n\
\ \"acc_norm_stderr\": 0.003730972670511862\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.038990736873573344,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.038990736873573344\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404947,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816505,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816505\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.42105263157894735,\n\
\ \"acc_stderr\": 0.046446020912223177,\n \"acc_norm\": 0.42105263157894735,\n\
\ \"acc_norm_stderr\": 0.046446020912223177\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3862433862433862,\n \"acc_stderr\": 0.025075981767601684,\n \"\
acc_norm\": 0.3862433862433862,\n \"acc_norm_stderr\": 0.025075981767601684\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6870967741935484,\n\
\ \"acc_stderr\": 0.02637756702864586,\n \"acc_norm\": 0.6870967741935484,\n\
\ \"acc_norm_stderr\": 0.02637756702864586\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.03515895551165698,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.03515895551165698\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8393782383419689,\n \"acc_stderr\": 0.026499057701397467,\n\
\ \"acc_norm\": 0.8393782383419689,\n \"acc_norm_stderr\": 0.026499057701397467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.024784316942156395,\n\
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.024784316942156395\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886783,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886783\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.818348623853211,\n \"acc_stderr\": 0.016530617409266864,\n \"\
acc_norm\": 0.818348623853211,\n \"acc_norm_stderr\": 0.016530617409266864\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588667,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588667\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.759493670886076,\n \"acc_stderr\": 0.02782078198114969,\n \
\ \"acc_norm\": 0.759493670886076,\n \"acc_norm_stderr\": 0.02782078198114969\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6322869955156951,\n\
\ \"acc_stderr\": 0.03236198350928275,\n \"acc_norm\": 0.6322869955156951,\n\
\ \"acc_norm_stderr\": 0.03236198350928275\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909476,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909476\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.04669510663875191,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.04669510663875191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.905982905982906,\n\
\ \"acc_stderr\": 0.019119892798924985,\n \"acc_norm\": 0.905982905982906,\n\
\ \"acc_norm_stderr\": 0.019119892798924985\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946005,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946005\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.025469770149400172,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.025469770149400172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172542,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172542\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.02545775669666788,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.02545775669666788\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42959582790091266,\n\
\ \"acc_stderr\": 0.012643004623790203,\n \"acc_norm\": 0.42959582790091266,\n\
\ \"acc_norm_stderr\": 0.012643004623790203\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6617647058823529,\n \"acc_stderr\": 0.028739328513983572,\n\
\ \"acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.028739328513983572\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.032357437893550424,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.032357437893550424\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.033799766898963086,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.033799766898963086\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5270943122503622,\n\
\ \"mc2_stderr\": 0.015098813242240394\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7750591949486977,\n \"acc_stderr\": 0.01173504356412673\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3912054586808188,\n \
\ \"acc_stderr\": 0.0134425024027943\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-26-40.729869.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-26-40.729869.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- '**/details_harness|winogrande|5_2024-03-09T20-26-40.729869.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-26-40.729869.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_26_40.729869
path:
- results_2024-03-09T20-26-40.729869.parquet
- split: latest
path:
- results_2024-03-09T20-26-40.729869.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Mistral-7B-Instruct-Aya-101
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Mistral-7B-Instruct-Aya-101](https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-Aya-101) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
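Each timestamped split name encodes the run time and can be parsed back into a `datetime` (a minimal sketch; the format string assumes the `YYYY_MM_DDTHH_MM_SS.micro` pattern used by these splits):

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Convert a split name like '2024_03_09T20_26_40.729869' into a datetime."""
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

run_time = parse_split_timestamp("2024_03_09T20_26_40.729869")
print(run_time.isoformat())  # → 2024-03-09T20:26:40.729869
```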
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-09T20:26:40.729869](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-Aya-101/blob/main/results_2024-03-09T20-26-40.729869.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6172980252936014,
"acc_stderr": 0.032867077607959115,
"acc_norm": 0.6226740310135067,
"acc_norm_stderr": 0.03353287092767944,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5270943122503622,
"mc2_stderr": 0.015098813242240394
},
"harness|arc:challenge|25": {
"acc": 0.5477815699658704,
"acc_stderr": 0.014544519880633823,
"acc_norm": 0.591296928327645,
"acc_norm_stderr": 0.014365750345426998
},
"harness|hellaswag|10": {
"acc": 0.6266679944234216,
"acc_stderr": 0.004827006520802887,
"acc_norm": 0.8320055765783708,
"acc_norm_stderr": 0.003730972670511862
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.038990736873573344,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.038990736873573344
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404947,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.42105263157894735,
"acc_stderr": 0.046446020912223177,
"acc_norm": 0.42105263157894735,
"acc_norm_stderr": 0.046446020912223177
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3862433862433862,
"acc_stderr": 0.025075981767601684,
"acc_norm": 0.3862433862433862,
"acc_norm_stderr": 0.025075981767601684
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6870967741935484,
"acc_stderr": 0.02637756702864586,
"acc_norm": 0.6870967741935484,
"acc_norm_stderr": 0.02637756702864586
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.03515895551165698,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.03515895551165698
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8393782383419689,
"acc_stderr": 0.026499057701397467,
"acc_norm": 0.8393782383419689,
"acc_norm_stderr": 0.026499057701397467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.024784316942156395,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.024784316942156395
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886783,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886783
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.818348623853211,
"acc_stderr": 0.016530617409266864,
"acc_norm": 0.818348623853211,
"acc_norm_stderr": 0.016530617409266864
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588667,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588667
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.759493670886076,
"acc_stderr": 0.02782078198114969,
"acc_norm": 0.759493670886076,
"acc_norm_stderr": 0.02782078198114969
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6322869955156951,
"acc_stderr": 0.03236198350928275,
"acc_norm": 0.6322869955156951,
"acc_norm_stderr": 0.03236198350928275
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909476,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909476
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.04669510663875191,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.04669510663875191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.905982905982906,
"acc_stderr": 0.019119892798924985,
"acc_norm": 0.905982905982906,
"acc_norm_stderr": 0.019119892798924985
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946005,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946005
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.025469770149400172,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.025469770149400172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172542,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172542
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.02545775669666788,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.02545775669666788
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42959582790091266,
"acc_stderr": 0.012643004623790203,
"acc_norm": 0.42959582790091266,
"acc_norm_stderr": 0.012643004623790203
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.028739328513983572,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.028739328513983572
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.032357437893550424,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.032357437893550424
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.033799766898963086,
"acc_norm": 0.87,
"acc_norm_stderr": 0.033799766898963086
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5270943122503622,
"mc2_stderr": 0.015098813242240394
},
"harness|winogrande|5": {
"acc": 0.7750591949486977,
"acc_stderr": 0.01173504356412673
},
"harness|gsm8k|5": {
"acc": 0.3912054586808188,
"acc_stderr": 0.0134425024027943
}
}
```
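The reported `acc_stderr` values can be turned into rough 95% confidence intervals (a minimal sketch assuming a normal approximation; the numbers below are the aggregate `all` scores from the JSON above):

```python
def confidence_interval(acc: float, stderr: float, z: float = 1.96) -> tuple[float, float]:
    """Approximate 95% confidence interval under a normal approximation."""
    return acc - z * stderr, acc + z * stderr

# Aggregate accuracy and stderr from the "all" block above
low, high = confidence_interval(0.6172980252936014, 0.032867077607959115)
print(f"acc = 0.617 (95% CI: {low:.3f}-{high:.3f})")  # → acc = 0.617 (95% CI: 0.553-0.682)
```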
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

---
pretty_name: Evaluation run of MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2](https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:33:17.443758](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2/blob/main/results_2024-03-09T20-33-17.443758.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks; you can find each one in the results and in the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.615849136003834,\n\
\ \"acc_stderr\": 0.032970270965001755,\n \"acc_norm\": 0.620423209333745,\n\
\ \"acc_norm_stderr\": 0.03363666991029327,\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6421524161247847,\n\
\ \"mc2_stderr\": 0.01506624561829692\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5776450511945392,\n \"acc_stderr\": 0.014434138713379977,\n\
\ \"acc_norm\": 0.6203071672354948,\n \"acc_norm_stderr\": 0.014182119866974872\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6369249153555069,\n\
\ \"acc_stderr\": 0.004799034356969391,\n \"acc_norm\": 0.8298147779326828,\n\
\ \"acc_norm_stderr\": 0.0037502741958275972\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.03942082639927213,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.03942082639927213\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.032469569197899575,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.032469569197899575\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.025138091388851105,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.025138091388851105\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.026069362295335134,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.026069362295335134\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.62,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\"\
: 0.62,\n \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306422,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306422\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5666666666666667,\n \"acc_stderr\": 0.025124653525885113,\n\
\ \"acc_norm\": 0.5666666666666667,\n \"acc_norm_stderr\": 0.025124653525885113\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8128440366972477,\n \"acc_stderr\": 0.01672268452620015,\n \"\
acc_norm\": 0.8128440366972477,\n \"acc_norm_stderr\": 0.01672268452620015\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690879,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690879\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179333,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7790549169859514,\n\
\ \"acc_stderr\": 0.014836205167333555,\n \"acc_norm\": 0.7790549169859514,\n\
\ \"acc_norm_stderr\": 0.014836205167333555\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6763005780346821,\n \"acc_stderr\": 0.025190181327608408,\n\
\ \"acc_norm\": 0.6763005780346821,\n \"acc_norm_stderr\": 0.025190181327608408\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.394413407821229,\n\
\ \"acc_stderr\": 0.01634538676210397,\n \"acc_norm\": 0.394413407821229,\n\
\ \"acc_norm_stderr\": 0.01634538676210397\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6759259259259259,\n \"acc_stderr\": 0.026041766202717156,\n\
\ \"acc_norm\": 0.6759259259259259,\n \"acc_norm_stderr\": 0.026041766202717156\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4439374185136897,\n\
\ \"acc_stderr\": 0.012689708167787687,\n \"acc_norm\": 0.4439374185136897,\n\
\ \"acc_norm_stderr\": 0.012689708167787687\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.02928941340940319,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.02928941340940319\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6176470588235294,\n \"acc_stderr\": 0.019659922493623354,\n \
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.019659922493623354\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7272727272727273,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.7272727272727273,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.0289205832206756,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.0289205832206756\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7910447761194029,\n\
\ \"acc_stderr\": 0.028748298931728655,\n \"acc_norm\": 0.7910447761194029,\n\
\ \"acc_norm_stderr\": 0.028748298931728655\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47368421052631576,\n\
\ \"mc1_stderr\": 0.017479241161975526,\n \"mc2\": 0.6421524161247847,\n\
\ \"mc2_stderr\": 0.01506624561829692\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7758484609313339,\n \"acc_stderr\": 0.011720400740774092\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42153146322971946,\n \
\ \"acc_stderr\": 0.013601824409483267\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-33-17.443758.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-33-17.443758.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- '**/details_harness|winogrande|5_2024-03-09T20-33-17.443758.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-33-17.443758.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_33_17.443758
path:
- results_2024-03-09T20-33-17.443758.parquet
- split: latest
path:
- results_2024-03-09T20-33-17.443758.parquet
---
# Dataset Card for Evaluation run of MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2](https://huggingface.co/MaziyarPanahi/Mistral-7B-Instruct-KhanAcademy-v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-09T20:33:17.443758](https://huggingface.co/datasets/open-llm-leaderboard/details_MaziyarPanahi__Mistral-7B-Instruct-KhanAcademy-v0.2/blob/main/results_2024-03-09T20-33-17.443758.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.615849136003834,
"acc_stderr": 0.032970270965001755,
"acc_norm": 0.620423209333745,
"acc_norm_stderr": 0.03363666991029327,
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.6421524161247847,
"mc2_stderr": 0.01506624561829692
},
"harness|arc:challenge|25": {
"acc": 0.5776450511945392,
"acc_stderr": 0.014434138713379977,
"acc_norm": 0.6203071672354948,
"acc_norm_stderr": 0.014182119866974872
},
"harness|hellaswag|10": {
"acc": 0.6369249153555069,
"acc_stderr": 0.004799034356969391,
"acc_norm": 0.8298147779326828,
"acc_norm_stderr": 0.0037502741958275972
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.03942082639927213,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.03942082639927213
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.032469569197899575,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.032469569197899575
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.025138091388851105,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.025138091388851105
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7,
"acc_stderr": 0.026069362295335134,
"acc_norm": 0.7,
"acc_norm_stderr": 0.026069362295335134
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.62,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.62,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306422,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306422
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5666666666666667,
"acc_stderr": 0.025124653525885113,
"acc_norm": 0.5666666666666667,
"acc_norm_stderr": 0.025124653525885113
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8128440366972477,
"acc_stderr": 0.01672268452620015,
"acc_norm": 0.8128440366972477,
"acc_norm_stderr": 0.01672268452620015
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690879,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690879
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179333,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7790549169859514,
"acc_stderr": 0.014836205167333555,
"acc_norm": 0.7790549169859514,
"acc_norm_stderr": 0.014836205167333555
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6763005780346821,
"acc_stderr": 0.025190181327608408,
"acc_norm": 0.6763005780346821,
"acc_norm_stderr": 0.025190181327608408
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.394413407821229,
"acc_stderr": 0.01634538676210397,
"acc_norm": 0.394413407821229,
"acc_norm_stderr": 0.01634538676210397
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.026041766202717156,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.026041766202717156
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4439374185136897,
"acc_stderr": 0.012689708167787687,
"acc_norm": 0.4439374185136897,
"acc_norm_stderr": 0.012689708167787687
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.02928941340940319,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.02928941340940319
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.019659922493623354,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.019659922493623354
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.0289205832206756,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.0289205832206756
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7910447761194029,
"acc_stderr": 0.028748298931728655,
"acc_norm": 0.7910447761194029,
"acc_norm_stderr": 0.028748298931728655
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47368421052631576,
"mc1_stderr": 0.017479241161975526,
"mc2": 0.6421524161247847,
"mc2_stderr": 0.01506624561829692
},
"harness|winogrande|5": {
"acc": 0.7758484609313339,
"acc_stderr": 0.011720400740774092
},
"harness|gsm8k|5": {
"acc": 0.42153146322971946,
"acc_stderr": 0.013601824409483267
}
}
```
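Since the card prints the aggregated results as a Python dict literal, you can pull a single metric out of it without downloading anything, using `ast.literal_eval` for safe parsing. A minimal sketch (using a small excerpt of the block above; the full literal parses the same way):

```python
import ast

# Excerpt of the results literal shown above; the full block has the same shape.
results_literal = """
{
    "all": {
        "acc": 0.615849136003834,
        "acc_stderr": 0.032970270965001755
    },
    "harness|winogrande|5": {
        "acc": 0.7758484609313339,
        "acc_stderr": 0.011720400740774092
    }
}
"""

# ast.literal_eval safely evaluates the dict literal (no arbitrary code execution).
results = ast.literal_eval(results_literal)

winogrande_acc = results["harness|winogrande|5"]["acc"]
print(f"winogrande acc: {winogrande_acc:.4f}")  # → winogrande acc: 0.7758
```

The same pattern works on the `results` config loaded with `load_dataset`, where each benchmark's metrics live under a `harness|<task>|<n_shot>` key.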
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of jisukim8873/falcon-7B-case-8
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [jisukim8873/falcon-7B-case-8](https://huggingface.co/jisukim8873/falcon-7B-case-8)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_jisukim8873__falcon-7B-case-8\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:50:47.085955](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__falcon-7B-case-8/blob/main/results_2024-03-09T20-50-47.085955.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.31319141324446226,\n\
\ \"acc_stderr\": 0.032438104383807814,\n \"acc_norm\": 0.3144746000410427,\n\
\ \"acc_norm_stderr\": 0.03318873637542751,\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476196,\n \"mc2\": 0.37575591444529527,\n\
\ \"mc2_stderr\": 0.014304714495441502\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4598976109215017,\n \"acc_stderr\": 0.014564318856924848,\n\
\ \"acc_norm\": 0.4948805460750853,\n \"acc_norm_stderr\": 0.014610624890309157\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5965943039235212,\n\
\ \"acc_stderr\": 0.004895782107786499,\n \"acc_norm\": 0.7855008962358097,\n\
\ \"acc_norm_stderr\": 0.0040963551251175165\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.3111111111111111,\n\
\ \"acc_stderr\": 0.039992628766177214,\n \"acc_norm\": 0.3111111111111111,\n\
\ \"acc_norm_stderr\": 0.039992628766177214\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.2894736842105263,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.2894736842105263,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3433962264150943,\n \"acc_stderr\": 0.02922452646912479,\n\
\ \"acc_norm\": 0.3433962264150943,\n \"acc_norm_stderr\": 0.02922452646912479\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2847222222222222,\n\
\ \"acc_stderr\": 0.037738099906869355,\n \"acc_norm\": 0.2847222222222222,\n\
\ \"acc_norm_stderr\": 0.037738099906869355\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.11,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.11,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n\
\ \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.2832369942196532,\n\
\ \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3276595744680851,\n \"acc_stderr\": 0.030683020843231004,\n\
\ \"acc_norm\": 0.3276595744680851,\n \"acc_norm_stderr\": 0.030683020843231004\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.0383515395439942,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.0383515395439942\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.034550710191021496,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.034550710191021496\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3161290322580645,\n \"acc_stderr\": 0.026450874489042774,\n \"\
acc_norm\": 0.3161290322580645,\n \"acc_norm_stderr\": 0.026450874489042774\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.33004926108374383,\n \"acc_stderr\": 0.033085304262282574,\n \"\
acc_norm\": 0.33004926108374383,\n \"acc_norm_stderr\": 0.033085304262282574\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.38181818181818183,\n \"acc_stderr\": 0.03793713171165634,\n\
\ \"acc_norm\": 0.38181818181818183,\n \"acc_norm_stderr\": 0.03793713171165634\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3383838383838384,\n \"acc_stderr\": 0.03371124142626302,\n \"\
acc_norm\": 0.3383838383838384,\n \"acc_norm_stderr\": 0.03371124142626302\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.27979274611398963,\n \"acc_stderr\": 0.032396370467357036,\n\
\ \"acc_norm\": 0.27979274611398963,\n \"acc_norm_stderr\": 0.032396370467357036\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2641025641025641,\n \"acc_stderr\": 0.02235219373745327,\n \
\ \"acc_norm\": 0.2641025641025641,\n \"acc_norm_stderr\": 0.02235219373745327\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.28991596638655465,\n \"acc_stderr\": 0.029472485833136084,\n\
\ \"acc_norm\": 0.28991596638655465,\n \"acc_norm_stderr\": 0.029472485833136084\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31125827814569534,\n \"acc_stderr\": 0.03780445850526732,\n \"\
acc_norm\": 0.31125827814569534,\n \"acc_norm_stderr\": 0.03780445850526732\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.29908256880733947,\n \"acc_stderr\": 0.019630417285415175,\n \"\
acc_norm\": 0.29908256880733947,\n \"acc_norm_stderr\": 0.019630417285415175\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.22685185185185186,\n \"acc_stderr\": 0.028561650102422263,\n \"\
acc_norm\": 0.22685185185185186,\n \"acc_norm_stderr\": 0.028561650102422263\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.28431372549019607,\n \"acc_stderr\": 0.031660096793998116,\n \"\
acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.031660096793998116\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29535864978902954,\n \"acc_stderr\": 0.029696338713422876,\n \
\ \"acc_norm\": 0.29535864978902954,\n \"acc_norm_stderr\": 0.029696338713422876\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3901345291479821,\n\
\ \"acc_stderr\": 0.03273766725459157,\n \"acc_norm\": 0.3901345291479821,\n\
\ \"acc_norm_stderr\": 0.03273766725459157\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3282442748091603,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.3282442748091603,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.371900826446281,\n \"acc_stderr\": 0.044120158066245044,\n \"\
acc_norm\": 0.371900826446281,\n \"acc_norm_stderr\": 0.044120158066245044\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.32407407407407407,\n\
\ \"acc_stderr\": 0.04524596007030048,\n \"acc_norm\": 0.32407407407407407,\n\
\ \"acc_norm_stderr\": 0.04524596007030048\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2883435582822086,\n \"acc_stderr\": 0.035590395316173425,\n\
\ \"acc_norm\": 0.2883435582822086,\n \"acc_norm_stderr\": 0.035590395316173425\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2857142857142857,\n\
\ \"acc_stderr\": 0.04287858751340456,\n \"acc_norm\": 0.2857142857142857,\n\
\ \"acc_norm_stderr\": 0.04287858751340456\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.3504273504273504,\n\
\ \"acc_stderr\": 0.03125610824421879,\n \"acc_norm\": 0.3504273504273504,\n\
\ \"acc_norm_stderr\": 0.03125610824421879\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.37292464878671777,\n\
\ \"acc_stderr\": 0.01729286826945392,\n \"acc_norm\": 0.37292464878671777,\n\
\ \"acc_norm_stderr\": 0.01729286826945392\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.3670520231213873,\n \"acc_stderr\": 0.025950054337654082,\n\
\ \"acc_norm\": 0.3670520231213873,\n \"acc_norm_stderr\": 0.025950054337654082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925303,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925303\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3504823151125402,\n\
\ \"acc_stderr\": 0.027098652621301754,\n \"acc_norm\": 0.3504823151125402,\n\
\ \"acc_norm_stderr\": 0.027098652621301754\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.3148148148148148,\n \"acc_stderr\": 0.025842248700902168,\n\
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.025842248700902168\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2801418439716312,\n \"acc_stderr\": 0.026789172351140242,\n \
\ \"acc_norm\": 0.2801418439716312,\n \"acc_norm_stderr\": 0.026789172351140242\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2737940026075619,\n\
\ \"acc_stderr\": 0.011388612167979395,\n \"acc_norm\": 0.2737940026075619,\n\
\ \"acc_norm_stderr\": 0.011388612167979395\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.26838235294117646,\n \"acc_stderr\": 0.026917481224377232,\n\
\ \"acc_norm\": 0.26838235294117646,\n \"acc_norm_stderr\": 0.026917481224377232\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.29901960784313725,\n \"acc_stderr\": 0.018521756215423024,\n \
\ \"acc_norm\": 0.29901960784313725,\n \"acc_norm_stderr\": 0.018521756215423024\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.33636363636363636,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.33636363636363636,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.3183673469387755,\n \"acc_stderr\": 0.029822533793982076,\n\
\ \"acc_norm\": 0.3183673469387755,\n \"acc_norm_stderr\": 0.029822533793982076\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3880597014925373,\n\
\ \"acc_stderr\": 0.0344578996436275,\n \"acc_norm\": 0.3880597014925373,\n\
\ \"acc_norm_stderr\": 0.0344578996436275\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3614457831325301,\n\
\ \"acc_stderr\": 0.037400593820293204,\n \"acc_norm\": 0.3614457831325301,\n\
\ \"acc_norm_stderr\": 0.037400593820293204\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3508771929824561,\n \"acc_stderr\": 0.03660298834049162,\n\
\ \"acc_norm\": 0.3508771929824561,\n \"acc_norm_stderr\": 0.03660298834049162\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2582619339045288,\n\
\ \"mc1_stderr\": 0.015321821688476196,\n \"mc2\": 0.37575591444529527,\n\
\ \"mc2_stderr\": 0.014304714495441502\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7048145224940805,\n \"acc_stderr\": 0.012819410741754763\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.06974981046247157,\n \
\ \"acc_stderr\": 0.007016389571013849\n }\n}\n```"
repo_url: https://huggingface.co/jisukim8873/falcon-7B-case-8
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-50-47.085955.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-50-47.085955.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- '**/details_harness|winogrande|5_2024-03-09T20-50-47.085955.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-50-47.085955.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_50_47.085955
path:
- results_2024-03-09T20-50-47.085955.parquet
- split: latest
path:
- results_2024-03-09T20-50-47.085955.parquet
---
# Dataset Card for Evaluation run of jisukim8873/falcon-7B-case-8
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [jisukim8873/falcon-7B-case-8](https://huggingface.co/jisukim8873/falcon-7B-case-8) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_jisukim8873__falcon-7B-case-8",
"harness_winogrande_5",
split="train")
```
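The per-task config names listed in the YAML header follow a mechanical pattern derived from the harness task name and the shot count; a minimal sketch of that mapping (the helper name is ours, not part of the leaderboard tooling):

```python
def task_to_config(task: str, shots: int) -> str:
    """Map a harness task name to its dataset config name,
    e.g. "hendrycksTest-world_religions" with 5 shots becomes
    "harness_hendrycksTest_world_religions_5"."""
    return "harness_" + task.replace("-", "_").replace(":", "_") + f"_{shots}"

print(task_to_config("hendrycksTest-world_religions", 5))
# harness_hendrycksTest_world_religions_5
print(task_to_config("truthfulqa:mc", 0))
# harness_truthfulqa_mc_0
```

Any of these names can be passed as the second argument to `load_dataset` above to pull the details for that task.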
## Latest results
These are the [latest results from run 2024-03-09T20:50:47.085955](https://huggingface.co/datasets/open-llm-leaderboard/details_jisukim8873__falcon-7B-case-8/blob/main/results_2024-03-09T20-50-47.085955.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own config and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.31319141324446226,
"acc_stderr": 0.032438104383807814,
"acc_norm": 0.3144746000410427,
"acc_norm_stderr": 0.03318873637542751,
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476196,
"mc2": 0.37575591444529527,
"mc2_stderr": 0.014304714495441502
},
"harness|arc:challenge|25": {
"acc": 0.4598976109215017,
"acc_stderr": 0.014564318856924848,
"acc_norm": 0.4948805460750853,
"acc_norm_stderr": 0.014610624890309157
},
"harness|hellaswag|10": {
"acc": 0.5965943039235212,
"acc_stderr": 0.004895782107786499,
"acc_norm": 0.7855008962358097,
"acc_norm_stderr": 0.0040963551251175165
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.039992628766177214,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.039992628766177214
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3433962264150943,
"acc_stderr": 0.02922452646912479,
"acc_norm": 0.3433962264150943,
"acc_norm_stderr": 0.02922452646912479
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2847222222222222,
"acc_stderr": 0.037738099906869355,
"acc_norm": 0.2847222222222222,
"acc_norm_stderr": 0.037738099906869355
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.11,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.11,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3276595744680851,
"acc_stderr": 0.030683020843231004,
"acc_norm": 0.3276595744680851,
"acc_norm_stderr": 0.030683020843231004
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.0383515395439942,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.0383515395439942
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.034550710191021496,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.034550710191021496
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3161290322580645,
"acc_stderr": 0.026450874489042774,
"acc_norm": 0.3161290322580645,
"acc_norm_stderr": 0.026450874489042774
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.33004926108374383,
"acc_stderr": 0.033085304262282574,
"acc_norm": 0.33004926108374383,
"acc_norm_stderr": 0.033085304262282574
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.03793713171165634,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.03793713171165634
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3383838383838384,
"acc_stderr": 0.03371124142626302,
"acc_norm": 0.3383838383838384,
"acc_norm_stderr": 0.03371124142626302
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.27979274611398963,
"acc_stderr": 0.032396370467357036,
"acc_norm": 0.27979274611398963,
"acc_norm_stderr": 0.032396370467357036
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2641025641025641,
"acc_stderr": 0.02235219373745327,
"acc_norm": 0.2641025641025641,
"acc_norm_stderr": 0.02235219373745327
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.28991596638655465,
"acc_stderr": 0.029472485833136084,
"acc_norm": 0.28991596638655465,
"acc_norm_stderr": 0.029472485833136084
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31125827814569534,
"acc_stderr": 0.03780445850526732,
"acc_norm": 0.31125827814569534,
"acc_norm_stderr": 0.03780445850526732
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.29908256880733947,
"acc_stderr": 0.019630417285415175,
"acc_norm": 0.29908256880733947,
"acc_norm_stderr": 0.019630417285415175
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.22685185185185186,
"acc_stderr": 0.028561650102422263,
"acc_norm": 0.22685185185185186,
"acc_norm_stderr": 0.028561650102422263
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.031660096793998116,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.031660096793998116
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29535864978902954,
"acc_stderr": 0.029696338713422876,
"acc_norm": 0.29535864978902954,
"acc_norm_stderr": 0.029696338713422876
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3901345291479821,
"acc_stderr": 0.03273766725459157,
"acc_norm": 0.3901345291479821,
"acc_norm_stderr": 0.03273766725459157
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3282442748091603,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.3282442748091603,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.371900826446281,
"acc_stderr": 0.044120158066245044,
"acc_norm": 0.371900826446281,
"acc_norm_stderr": 0.044120158066245044
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.32407407407407407,
"acc_stderr": 0.04524596007030048,
"acc_norm": 0.32407407407407407,
"acc_norm_stderr": 0.04524596007030048
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2883435582822086,
"acc_stderr": 0.035590395316173425,
"acc_norm": 0.2883435582822086,
"acc_norm_stderr": 0.035590395316173425
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.04287858751340456,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.04287858751340456
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.3504273504273504,
"acc_stderr": 0.03125610824421879,
"acc_norm": 0.3504273504273504,
"acc_norm_stderr": 0.03125610824421879
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.37292464878671777,
"acc_stderr": 0.01729286826945392,
"acc_norm": 0.37292464878671777,
"acc_norm_stderr": 0.01729286826945392
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.3670520231213873,
"acc_stderr": 0.025950054337654082,
"acc_norm": 0.3670520231213873,
"acc_norm_stderr": 0.025950054337654082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925303,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925303
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3504823151125402,
"acc_stderr": 0.027098652621301754,
"acc_norm": 0.3504823151125402,
"acc_norm_stderr": 0.027098652621301754
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.025842248700902168,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.025842248700902168
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2801418439716312,
"acc_stderr": 0.026789172351140242,
"acc_norm": 0.2801418439716312,
"acc_norm_stderr": 0.026789172351140242
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2737940026075619,
"acc_stderr": 0.011388612167979395,
"acc_norm": 0.2737940026075619,
"acc_norm_stderr": 0.011388612167979395
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.26838235294117646,
"acc_stderr": 0.026917481224377232,
"acc_norm": 0.26838235294117646,
"acc_norm_stderr": 0.026917481224377232
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.29901960784313725,
"acc_stderr": 0.018521756215423024,
"acc_norm": 0.29901960784313725,
"acc_norm_stderr": 0.018521756215423024
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.33636363636363636,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.33636363636363636,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.3183673469387755,
"acc_stderr": 0.029822533793982076,
"acc_norm": 0.3183673469387755,
"acc_norm_stderr": 0.029822533793982076
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3880597014925373,
"acc_stderr": 0.0344578996436275,
"acc_norm": 0.3880597014925373,
"acc_norm_stderr": 0.0344578996436275
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3614457831325301,
"acc_stderr": 0.037400593820293204,
"acc_norm": 0.3614457831325301,
"acc_norm_stderr": 0.037400593820293204
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3508771929824561,
"acc_stderr": 0.03660298834049162,
"acc_norm": 0.3508771929824561,
"acc_norm_stderr": 0.03660298834049162
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2582619339045288,
"mc1_stderr": 0.015321821688476196,
"mc2": 0.37575591444529527,
"mc2_stderr": 0.014304714495441502
},
"harness|winogrande|5": {
"acc": 0.7048145224940805,
"acc_stderr": 0.012819410741754763
},
"harness|gsm8k|5": {
"acc": 0.06974981046247157,
"acc_stderr": 0.007016389571013849
}
}
```
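To a first approximation, the headline "all" block is an unweighted mean over the per-task accuracies (this is our reading of the generated file, not something the card documents); a toy sketch using three of the scores above:

```python
# Three per-task acc values copied from the JSON above.
scores = {
    "hendrycksTest-abstract_algebra": 0.33,
    "hendrycksTest-anatomy": 0.3111111111111111,
    "hendrycksTest-astronomy": 0.2894736842105263,
}

# Unweighted mean across tasks.
mean_acc = sum(scores.values()) / len(scores)
print(round(mean_acc, 4))
# 0.3102
```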
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_OEvortex__vortex-3b | open-llm-leaderboard-old | "2024-03-09T20:59:24Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T20:59:01Z" | ---
pretty_name: Evaluation run of OEvortex/vortex-3b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OEvortex/vortex-3b](https://huggingface.co/OEvortex/vortex-3b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OEvortex__vortex-3b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:57:20.886463](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__vortex-3b/blob/main/results_2024-03-09T20-57-20.886463.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.27710891940955856,\n\
\ \"acc_stderr\": 0.03153668377872444,\n \"acc_norm\": 0.27903126889601115,\n\
\ \"acc_norm_stderr\": 0.03233434313923451,\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602576,\n \"mc2\": 0.37387212007515463,\n\
\ \"mc2_stderr\": 0.013976985373306746\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2901023890784983,\n \"acc_stderr\": 0.013261573677520774,\n\
\ \"acc_norm\": 0.3191126279863481,\n \"acc_norm_stderr\": 0.013621696119173304\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4282015534754033,\n\
\ \"acc_stderr\": 0.004938068627349492,\n \"acc_norm\": 0.5689105755825533,\n\
\ \"acc_norm_stderr\": 0.004942164585991472\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2962962962962963,\n\
\ \"acc_stderr\": 0.03944624162501116,\n \"acc_norm\": 0.2962962962962963,\n\
\ \"acc_norm_stderr\": 0.03944624162501116\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.031975658210324984,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.031975658210324984\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2792452830188679,\n \"acc_stderr\": 0.027611163402399715,\n\
\ \"acc_norm\": 0.2792452830188679,\n \"acc_norm_stderr\": 0.027611163402399715\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621503\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993178,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993178\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.31,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2553191489361702,\n \"acc_stderr\": 0.02850485647051419,\n\
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.02850485647051419\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2698412698412698,\n \"acc_stderr\": 0.022860838309232072,\n \"\
acc_norm\": 0.2698412698412698,\n \"acc_norm_stderr\": 0.022860838309232072\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.16666666666666666,\n\
\ \"acc_stderr\": 0.03333333333333337,\n \"acc_norm\": 0.16666666666666666,\n\
\ \"acc_norm_stderr\": 0.03333333333333337\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2709677419354839,\n\
\ \"acc_stderr\": 0.02528441611490016,\n \"acc_norm\": 0.2709677419354839,\n\
\ \"acc_norm_stderr\": 0.02528441611490016\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.031447125816782426,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.031447125816782426\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2606060606060606,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.2606060606060606,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2676767676767677,\n \"acc_stderr\": 0.03154449888270286,\n \"\
acc_norm\": 0.2676767676767677,\n \"acc_norm_stderr\": 0.03154449888270286\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.3160621761658031,\n \"acc_stderr\": 0.03355397369686173,\n\
\ \"acc_norm\": 0.3160621761658031,\n \"acc_norm_stderr\": 0.03355397369686173\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3384615384615385,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.3384615384615385,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.25630252100840334,\n \"acc_stderr\": 0.02835962087053395,\n\
\ \"acc_norm\": 0.25630252100840334,\n \"acc_norm_stderr\": 0.02835962087053395\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.03543304234389985,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.03543304234389985\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22385321100917432,\n \"acc_stderr\": 0.01787121776779022,\n \"\
acc_norm\": 0.22385321100917432,\n \"acc_norm_stderr\": 0.01787121776779022\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.2107843137254902,\n\
\ \"acc_stderr\": 0.02862654791243739,\n \"acc_norm\": 0.2107843137254902,\n\
\ \"acc_norm_stderr\": 0.02862654791243739\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.28270042194092826,\n \"acc_stderr\": 0.02931281415395593,\n\
\ \"acc_norm\": 0.28270042194092826,\n \"acc_norm_stderr\": 0.02931281415395593\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3721973094170404,\n\
\ \"acc_stderr\": 0.03244305283008731,\n \"acc_norm\": 0.3721973094170404,\n\
\ \"acc_norm_stderr\": 0.03244305283008731\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462202,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462202\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.34710743801652894,\n \"acc_stderr\": 0.043457245702925335,\n \"\
acc_norm\": 0.34710743801652894,\n \"acc_norm_stderr\": 0.043457245702925335\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.21296296296296297,\n\
\ \"acc_stderr\": 0.03957835471980981,\n \"acc_norm\": 0.21296296296296297,\n\
\ \"acc_norm_stderr\": 0.03957835471980981\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25153374233128833,\n \"acc_stderr\": 0.034089978868575295,\n\
\ \"acc_norm\": 0.25153374233128833,\n \"acc_norm_stderr\": 0.034089978868575295\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04547960999764376,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04547960999764376\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.045821241601615506,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.045821241601615506\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2564102564102564,\n\
\ \"acc_stderr\": 0.028605953702004253,\n \"acc_norm\": 0.2564102564102564,\n\
\ \"acc_norm_stderr\": 0.028605953702004253\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2656449553001277,\n\
\ \"acc_stderr\": 0.015794302487888722,\n \"acc_norm\": 0.2656449553001277,\n\
\ \"acc_norm_stderr\": 0.015794302487888722\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2658959537572254,\n \"acc_stderr\": 0.023786203255508287,\n\
\ \"acc_norm\": 0.2658959537572254,\n \"acc_norm_stderr\": 0.023786203255508287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24134078212290502,\n\
\ \"acc_stderr\": 0.014310999547961455,\n \"acc_norm\": 0.24134078212290502,\n\
\ \"acc_norm_stderr\": 0.014310999547961455\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2797427652733119,\n\
\ \"acc_stderr\": 0.02549425935069491,\n \"acc_norm\": 0.2797427652733119,\n\
\ \"acc_norm_stderr\": 0.02549425935069491\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2624113475177305,\n \"acc_stderr\": 0.026244920349843014,\n \
\ \"acc_norm\": 0.2624113475177305,\n \"acc_norm_stderr\": 0.026244920349843014\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.26401564537157757,\n\
\ \"acc_stderr\": 0.011258435537723831,\n \"acc_norm\": 0.26401564537157757,\n\
\ \"acc_norm_stderr\": 0.011258435537723831\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.35661764705882354,\n \"acc_stderr\": 0.02909720956841195,\n\
\ \"acc_norm\": 0.35661764705882354,\n \"acc_norm_stderr\": 0.02909720956841195\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.238562091503268,\n \"acc_stderr\": 0.017242385828779606,\n \
\ \"acc_norm\": 0.238562091503268,\n \"acc_norm_stderr\": 0.017242385828779606\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.34545454545454546,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.34545454545454546,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.31020408163265306,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.31020408163265306,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22388059701492538,\n\
\ \"acc_stderr\": 0.02947525023601719,\n \"acc_norm\": 0.22388059701492538,\n\
\ \"acc_norm_stderr\": 0.02947525023601719\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.29239766081871343,\n \"acc_stderr\": 0.034886477134579215,\n\
\ \"acc_norm\": 0.29239766081871343,\n \"acc_norm_stderr\": 0.034886477134579215\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22643818849449204,\n\
\ \"mc1_stderr\": 0.014651337324602576,\n \"mc2\": 0.37387212007515463,\n\
\ \"mc2_stderr\": 0.013976985373306746\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.601420678768745,\n \"acc_stderr\": 0.013760357176873834\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.009097801364670205,\n \
\ \"acc_stderr\": 0.002615326510775672\n }\n}\n```"
repo_url: https://huggingface.co/OEvortex/vortex-3b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-57-20.886463.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-57-20.886463.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- '**/details_harness|winogrande|5_2024-03-09T20-57-20.886463.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-57-20.886463.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_57_20.886463
path:
- results_2024-03-09T20-57-20.886463.parquet
- split: latest
path:
- results_2024-03-09T20-57-20.886463.parquet
---
# Dataset Card for Evaluation run of OEvortex/vortex-3b
Dataset automatically created during the evaluation run of model [OEvortex/vortex-3b](https://huggingface.co/OEvortex/vortex-3b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OEvortex__vortex-3b",
"harness_winogrande_5",
split="train")
```
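The per-subtask detail configurations follow a predictable naming scheme, visible in the configs list above. A minimal sketch of a helper for building those config names (the helper itself is ours for illustration, not part of the `datasets` API):

```python
def detail_config_name(subtask: str, num_fewshot: int = 5) -> str:
    """Build the config name for a per-subtask MMLU detail file.

    "hendrycksTest" is the harness's internal name for MMLU, and the
    trailing number is the few-shot count, matching the config names
    listed in this card's metadata.
    """
    return f"harness_hendrycksTest_{subtask}_{num_fewshot}"


# The result can be passed to load_dataset alongside the repo id, e.g.:
# load_dataset("open-llm-leaderboard/details_OEvortex__vortex-3b",
#              detail_config_name("world_religions"), split="latest")
print(detail_config_name("world_religions"))
```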
## Latest results
These are the [latest results from run 2024-03-09T20:57:20.886463](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__vortex-3b/blob/main/results_2024-03-09T20-57-20.886463.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.27710891940955856,
"acc_stderr": 0.03153668377872444,
"acc_norm": 0.27903126889601115,
"acc_norm_stderr": 0.03233434313923451,
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602576,
"mc2": 0.37387212007515463,
"mc2_stderr": 0.013976985373306746
},
"harness|arc:challenge|25": {
"acc": 0.2901023890784983,
"acc_stderr": 0.013261573677520774,
"acc_norm": 0.3191126279863481,
"acc_norm_stderr": 0.013621696119173304
},
"harness|hellaswag|10": {
"acc": 0.4282015534754033,
"acc_stderr": 0.004938068627349492,
"acc_norm": 0.5689105755825533,
"acc_norm_stderr": 0.004942164585991472
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.03944624162501116,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.03944624162501116
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.031975658210324984,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.031975658210324984
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2792452830188679,
"acc_stderr": 0.027611163402399715,
"acc_norm": 0.2792452830188679,
"acc_norm_stderr": 0.027611163402399715
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.25,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993178,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993178
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.02850485647051419,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.02850485647051419
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2698412698412698,
"acc_stderr": 0.022860838309232072,
"acc_norm": 0.2698412698412698,
"acc_norm_stderr": 0.022860838309232072
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.16666666666666666,
"acc_stderr": 0.03333333333333337,
"acc_norm": 0.16666666666666666,
"acc_norm_stderr": 0.03333333333333337
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2709677419354839,
"acc_stderr": 0.02528441611490016,
"acc_norm": 0.2709677419354839,
"acc_norm_stderr": 0.02528441611490016
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.031447125816782426,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.031447125816782426
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2606060606060606,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.2606060606060606,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2676767676767677,
"acc_stderr": 0.03154449888270286,
"acc_norm": 0.2676767676767677,
"acc_norm_stderr": 0.03154449888270286
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.3160621761658031,
"acc_stderr": 0.03355397369686173,
"acc_norm": 0.3160621761658031,
"acc_norm_stderr": 0.03355397369686173
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3384615384615385,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.3384615384615385,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.25630252100840334,
"acc_stderr": 0.02835962087053395,
"acc_norm": 0.25630252100840334,
"acc_norm_stderr": 0.02835962087053395
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.03543304234389985,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.03543304234389985
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22385321100917432,
"acc_stderr": 0.01787121776779022,
"acc_norm": 0.22385321100917432,
"acc_norm_stderr": 0.01787121776779022
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2107843137254902,
"acc_stderr": 0.02862654791243739,
"acc_norm": 0.2107843137254902,
"acc_norm_stderr": 0.02862654791243739
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.28270042194092826,
"acc_stderr": 0.02931281415395593,
"acc_norm": 0.28270042194092826,
"acc_norm_stderr": 0.02931281415395593
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3721973094170404,
"acc_stderr": 0.03244305283008731,
"acc_norm": 0.3721973094170404,
"acc_norm_stderr": 0.03244305283008731
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462202,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462202
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.34710743801652894,
"acc_stderr": 0.043457245702925335,
"acc_norm": 0.34710743801652894,
"acc_norm_stderr": 0.043457245702925335
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.21296296296296297,
"acc_stderr": 0.03957835471980981,
"acc_norm": 0.21296296296296297,
"acc_norm_stderr": 0.03957835471980981
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25153374233128833,
"acc_stderr": 0.034089978868575295,
"acc_norm": 0.25153374233128833,
"acc_norm_stderr": 0.034089978868575295
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04547960999764376,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04547960999764376
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.045821241601615506,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.045821241601615506
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.028605953702004253,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.028605953702004253
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2656449553001277,
"acc_stderr": 0.015794302487888722,
"acc_norm": 0.2656449553001277,
"acc_norm_stderr": 0.015794302487888722
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.023786203255508287,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.023786203255508287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24134078212290502,
"acc_stderr": 0.014310999547961455,
"acc_norm": 0.24134078212290502,
"acc_norm_stderr": 0.014310999547961455
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2797427652733119,
"acc_stderr": 0.02549425935069491,
"acc_norm": 0.2797427652733119,
"acc_norm_stderr": 0.02549425935069491
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2624113475177305,
"acc_stderr": 0.026244920349843014,
"acc_norm": 0.2624113475177305,
"acc_norm_stderr": 0.026244920349843014
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.26401564537157757,
"acc_stderr": 0.011258435537723831,
"acc_norm": 0.26401564537157757,
"acc_norm_stderr": 0.011258435537723831
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.35661764705882354,
"acc_stderr": 0.02909720956841195,
"acc_norm": 0.35661764705882354,
"acc_norm_stderr": 0.02909720956841195
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.238562091503268,
"acc_stderr": 0.017242385828779606,
"acc_norm": 0.238562091503268,
"acc_norm_stderr": 0.017242385828779606
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.34545454545454546,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.34545454545454546,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.31020408163265306,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.31020408163265306,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22388059701492538,
"acc_stderr": 0.02947525023601719,
"acc_norm": 0.22388059701492538,
"acc_norm_stderr": 0.02947525023601719
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.29239766081871343,
"acc_stderr": 0.034886477134579215,
"acc_norm": 0.29239766081871343,
"acc_norm_stderr": 0.034886477134579215
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22643818849449204,
"mc1_stderr": 0.014651337324602576,
"mc2": 0.37387212007515463,
"mc2_stderr": 0.013976985373306746
},
"harness|winogrande|5": {
"acc": 0.601420678768745,
"acc_stderr": 0.013760357176873834
},
"harness|gsm8k|5": {
"acc": 0.009097801364670205,
"acc_stderr": 0.002615326510775672
}
}
```
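Once loaded, these aggregated results are plain nested dictionaries keyed by task. A minimal sketch of summarizing per-task scores, using a hand-copied subset of the JSON above rather than a live download (the `best_acc` helper is illustrative, not part of the harness):

```python
# A hand-copied subset of the aggregated results shown above (not a live download).
results = {
    "harness|arc:challenge|25": {"acc": 0.2901023890784983, "acc_norm": 0.3191126279863481},
    "harness|hellaswag|10": {"acc": 0.4282015534754033, "acc_norm": 0.5689105755825533},
    "harness|winogrande|5": {"acc": 0.601420678768745},
}

def best_acc(metrics: dict) -> float:
    # Prefer the normalized accuracy where the harness reports it, else raw accuracy.
    return metrics.get("acc_norm", metrics["acc"])

scores = {task: best_acc(m) for task, m in results.items()}
average = sum(scores.values()) / len(scores)
print(f"{average:.4f}")  # -> 0.4965
```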
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of h4rz3rk4s3/TinyPoliticaLlama-1.1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h4rz3rk4s3/TinyPoliticaLlama-1.1B](https://huggingface.co/h4rz3rk4s3/TinyPoliticaLlama-1.1B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T20:58:23.188763](https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B/blob/main/results_2024-03-09T20-58-23.188763.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2592181333001868,\n\
\ \"acc_stderr\": 0.030987518923392604,\n \"acc_norm\": 0.26142713275875756,\n\
\ \"acc_norm_stderr\": 0.031810688522865525,\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427688,\n \"mc2\": 0.3805985213573371,\n\
\ \"mc2_stderr\": 0.01395025708087029\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.295221843003413,\n \"acc_stderr\": 0.013329750293382316,\n\
\ \"acc_norm\": 0.3378839590443686,\n \"acc_norm_stderr\": 0.013822047922283514\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4320852419836686,\n\
\ \"acc_stderr\": 0.0049435372423444176,\n \"acc_norm\": 0.5782712607050389,\n\
\ \"acc_norm_stderr\": 0.004928263494616739\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17777777777777778,\n\
\ \"acc_stderr\": 0.033027898599017176,\n \"acc_norm\": 0.17777777777777778,\n\
\ \"acc_norm_stderr\": 0.033027898599017176\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123408,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123408\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.02674989977124123,\n\
\ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.02674989977124123\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2777777777777778,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.2777777777777778,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2543352601156069,\n\
\ \"acc_stderr\": 0.0332055644308557,\n \"acc_norm\": 0.2543352601156069,\n\
\ \"acc_norm_stderr\": 0.0332055644308557\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.27450980392156865,\n \"acc_stderr\": 0.044405219061793275,\n\
\ \"acc_norm\": 0.27450980392156865,\n \"acc_norm_stderr\": 0.044405219061793275\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n\
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.026148818018424495,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.026148818018424495\n \
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.20175438596491227,\n\
\ \"acc_stderr\": 0.037752050135836386,\n \"acc_norm\": 0.20175438596491227,\n\
\ \"acc_norm_stderr\": 0.037752050135836386\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.24867724867724866,\n \"acc_stderr\": 0.022261817692400168,\n \"\
acc_norm\": 0.24867724867724866,\n \"acc_norm_stderr\": 0.022261817692400168\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.1935483870967742,\n\
\ \"acc_stderr\": 0.02247525852553606,\n \"acc_norm\": 0.1935483870967742,\n\
\ \"acc_norm_stderr\": 0.02247525852553606\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.18719211822660098,\n \"acc_stderr\": 0.027444924966882618,\n\
\ \"acc_norm\": 0.18719211822660098,\n \"acc_norm_stderr\": 0.027444924966882618\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.18686868686868688,\n \"acc_stderr\": 0.027772533334218977,\n \"\
acc_norm\": 0.18686868686868688,\n \"acc_norm_stderr\": 0.027772533334218977\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860657,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860657\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2948717948717949,\n \"acc_stderr\": 0.02311936275823229,\n \
\ \"acc_norm\": 0.2948717948717949,\n \"acc_norm_stderr\": 0.02311936275823229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2018348623853211,\n \"acc_stderr\": 0.01720857935778757,\n \"\
acc_norm\": 0.2018348623853211,\n \"acc_norm_stderr\": 0.01720857935778757\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.28703703703703703,\n \"acc_stderr\": 0.030851992993257017,\n \"\
acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.030851992993257017\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.270042194092827,\n \"acc_stderr\": 0.028900721906293426,\n\
\ \"acc_norm\": 0.270042194092827,\n \"acc_norm_stderr\": 0.028900721906293426\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3452914798206278,\n\
\ \"acc_stderr\": 0.03191100192835794,\n \"acc_norm\": 0.3452914798206278,\n\
\ \"acc_norm_stderr\": 0.03191100192835794\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.29770992366412213,\n \"acc_stderr\": 0.04010358942462203,\n\
\ \"acc_norm\": 0.29770992366412213,\n \"acc_norm_stderr\": 0.04010358942462203\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1941747572815534,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.1941747572815534,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2863247863247863,\n\
\ \"acc_stderr\": 0.02961432369045665,\n \"acc_norm\": 0.2863247863247863,\n\
\ \"acc_norm_stderr\": 0.02961432369045665\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24904214559386972,\n\
\ \"acc_stderr\": 0.015464676163395977,\n \"acc_norm\": 0.24904214559386972,\n\
\ \"acc_norm_stderr\": 0.015464676163395977\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.02335736578587404,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.02335736578587404\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808835,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808835\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.023929155517351284,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.023929155517351284\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.24115755627009647,\n\
\ \"acc_stderr\": 0.024296594034763426,\n \"acc_norm\": 0.24115755627009647,\n\
\ \"acc_norm_stderr\": 0.024296594034763426\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2345679012345679,\n \"acc_stderr\": 0.023576881744005716,\n\
\ \"acc_norm\": 0.2345679012345679,\n \"acc_norm_stderr\": 0.023576881744005716\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.21631205673758866,\n \"acc_stderr\": 0.024561720560562796,\n \
\ \"acc_norm\": 0.21631205673758866,\n \"acc_norm_stderr\": 0.024561720560562796\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2470664928292047,\n\
\ \"acc_stderr\": 0.011015752255279341,\n \"acc_norm\": 0.2470664928292047,\n\
\ \"acc_norm_stderr\": 0.011015752255279341\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20220588235294118,\n \"acc_stderr\": 0.024398192986654924,\n\
\ \"acc_norm\": 0.20220588235294118,\n \"acc_norm_stderr\": 0.024398192986654924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.23529411764705882,\n \"acc_stderr\": 0.01716058723504634,\n \
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.01716058723504634\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.21818181818181817,\n\
\ \"acc_stderr\": 0.03955932861795833,\n \"acc_norm\": 0.21818181818181817,\n\
\ \"acc_norm_stderr\": 0.03955932861795833\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.19591836734693877,\n \"acc_stderr\": 0.025409301953225678,\n\
\ \"acc_norm\": 0.19591836734693877,\n \"acc_norm_stderr\": 0.025409301953225678\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.27860696517412936,\n\
\ \"acc_stderr\": 0.031700561834973086,\n \"acc_norm\": 0.27860696517412936,\n\
\ \"acc_norm_stderr\": 0.031700561834973086\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.29518072289156627,\n\
\ \"acc_stderr\": 0.035509201856896294,\n \"acc_norm\": 0.29518072289156627,\n\
\ \"acc_norm_stderr\": 0.035509201856896294\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21542227662178703,\n\
\ \"mc1_stderr\": 0.014391902652427688,\n \"mc2\": 0.3805985213573371,\n\
\ \"mc2_stderr\": 0.01395025708087029\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5769534333070244,\n \"acc_stderr\": 0.013885055359056476\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/h4rz3rk4s3/TinyPoliticaLlama-1.1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-58-23.188763.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T20-58-23.188763.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- '**/details_harness|winogrande|5_2024-03-09T20-58-23.188763.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T20-58-23.188763.parquet'
- config_name: results
data_files:
- split: 2024_03_09T20_58_23.188763
path:
- results_2024-03-09T20-58-23.188763.parquet
- split: latest
path:
- results_2024-03-09T20-58-23.188763.parquet
---
# Dataset Card for Evaluation run of h4rz3rk4s3/TinyPoliticaLlama-1.1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [h4rz3rk4s3/TinyPoliticaLlama-1.1B](https://huggingface.co/h4rz3rk4s3/TinyPoliticaLlama-1.1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B",
"harness_winogrande_5",
split="train")
```
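The per-task configs listed above follow a regular naming scheme: `harness_<task>_<num_fewshot>`, with the MMLU subjects appearing as `harness_hendrycksTest_<subject>_5`. As a convenience, a small helper can build the config name for any subject (a minimal sketch; the helper name is illustrative, not part of this dataset):

```python
def mmlu_config_name(subject: str, num_fewshot: int = 5) -> str:
    """Build the config name for an MMLU (hendrycksTest) subject,
    following the naming scheme used in this dataset's config list."""
    return f"harness_hendrycksTest_{subject}_{num_fewshot}"

# e.g. load the 5-shot astronomy details at the latest snapshot:
# data = load_dataset("open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B",
#                     mmlu_config_name("astronomy"),
#                     split="latest")
```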
## Latest results
These are the [latest results from run 2024-03-09T20:58:23.188763](https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyPoliticaLlama-1.1B/blob/main/results_2024-03-09T20-58-23.188763.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of its configuration):
```python
{
"all": {
"acc": 0.2592181333001868,
"acc_stderr": 0.030987518923392604,
"acc_norm": 0.26142713275875756,
"acc_norm_stderr": 0.031810688522865525,
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427688,
"mc2": 0.3805985213573371,
"mc2_stderr": 0.01395025708087029
},
"harness|arc:challenge|25": {
"acc": 0.295221843003413,
"acc_stderr": 0.013329750293382316,
"acc_norm": 0.3378839590443686,
"acc_norm_stderr": 0.013822047922283514
},
"harness|hellaswag|10": {
"acc": 0.4320852419836686,
"acc_stderr": 0.0049435372423444176,
"acc_norm": 0.5782712607050389,
"acc_norm_stderr": 0.004928263494616739
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17777777777777778,
"acc_stderr": 0.033027898599017176,
"acc_norm": 0.17777777777777778,
"acc_norm_stderr": 0.033027898599017176
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123408,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123408
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.02674989977124123,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.02674989977124123
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2543352601156069,
"acc_stderr": 0.0332055644308557,
"acc_norm": 0.2543352601156069,
"acc_norm_stderr": 0.0332055644308557
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.27450980392156865,
"acc_stderr": 0.044405219061793275,
"acc_norm": 0.27450980392156865,
"acc_norm_stderr": 0.044405219061793275
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2,
"acc_stderr": 0.026148818018424495,
"acc_norm": 0.2,
"acc_norm_stderr": 0.026148818018424495
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.20175438596491227,
"acc_stderr": 0.037752050135836386,
"acc_norm": 0.20175438596491227,
"acc_norm_stderr": 0.037752050135836386
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.24867724867724866,
"acc_stderr": 0.022261817692400168,
"acc_norm": 0.24867724867724866,
"acc_norm_stderr": 0.022261817692400168
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.1935483870967742,
"acc_stderr": 0.02247525852553606,
"acc_norm": 0.1935483870967742,
"acc_norm_stderr": 0.02247525852553606
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.18719211822660098,
"acc_stderr": 0.027444924966882618,
"acc_norm": 0.18719211822660098,
"acc_norm_stderr": 0.027444924966882618
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.18686868686868688,
"acc_stderr": 0.027772533334218977,
"acc_norm": 0.18686868686868688,
"acc_norm_stderr": 0.027772533334218977
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860657,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860657
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.02311936275823229,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.02311936275823229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230193,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2018348623853211,
"acc_stderr": 0.01720857935778757,
"acc_norm": 0.2018348623853211,
"acc_norm_stderr": 0.01720857935778757
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.030851992993257017,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.030851992993257017
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.270042194092827,
"acc_stderr": 0.028900721906293426,
"acc_norm": 0.270042194092827,
"acc_norm_stderr": 0.028900721906293426
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3452914798206278,
"acc_stderr": 0.03191100192835794,
"acc_norm": 0.3452914798206278,
"acc_norm_stderr": 0.03191100192835794
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.29770992366412213,
"acc_stderr": 0.04010358942462203,
"acc_norm": 0.29770992366412213,
"acc_norm_stderr": 0.04010358942462203
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.1941747572815534,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.1941747572815534,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2863247863247863,
"acc_stderr": 0.02961432369045665,
"acc_norm": 0.2863247863247863,
"acc_norm_stderr": 0.02961432369045665
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24904214559386972,
"acc_stderr": 0.015464676163395977,
"acc_norm": 0.24904214559386972,
"acc_norm_stderr": 0.015464676163395977
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.02335736578587404,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.02335736578587404
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808835,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808835
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.023929155517351284,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.023929155517351284
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.24115755627009647,
"acc_stderr": 0.024296594034763426,
"acc_norm": 0.24115755627009647,
"acc_norm_stderr": 0.024296594034763426
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2345679012345679,
"acc_stderr": 0.023576881744005716,
"acc_norm": 0.2345679012345679,
"acc_norm_stderr": 0.023576881744005716
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.21631205673758866,
"acc_stderr": 0.024561720560562796,
"acc_norm": 0.21631205673758866,
"acc_norm_stderr": 0.024561720560562796
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2470664928292047,
"acc_stderr": 0.011015752255279341,
"acc_norm": 0.2470664928292047,
"acc_norm_stderr": 0.011015752255279341
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20220588235294118,
"acc_stderr": 0.024398192986654924,
"acc_norm": 0.20220588235294118,
"acc_norm_stderr": 0.024398192986654924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.01716058723504634,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.01716058723504634
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03955932861795833,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03955932861795833
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19591836734693877,
"acc_stderr": 0.025409301953225678,
"acc_norm": 0.19591836734693877,
"acc_norm_stderr": 0.025409301953225678
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.27860696517412936,
"acc_stderr": 0.031700561834973086,
"acc_norm": 0.27860696517412936,
"acc_norm_stderr": 0.031700561834973086
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-virology|5": {
"acc": 0.29518072289156627,
"acc_stderr": 0.035509201856896294,
"acc_norm": 0.29518072289156627,
"acc_norm_stderr": 0.035509201856896294
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21542227662178703,
"mc1_stderr": 0.014391902652427688,
"mc2": 0.3805985213573371,
"mc2_stderr": 0.01395025708087029
},
"harness|winogrande|5": {
"acc": 0.5769534333070244,
"acc_stderr": 0.013885055359056476
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
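The `acc_stderr` fields above are bootstrap standard errors, so a rough 95% confidence interval for each score is `acc ± 1.96 × acc_stderr` (assuming the errors are approximately normal). A minimal sketch in plain Python, using a few of the values reported above:

```python
# Normal-approximation 95% confidence interval from a reported
# accuracy and its standard error: acc +/- 1.96 * stderr.
def conf_interval(acc: float, stderr: float, z: float = 1.96) -> tuple[float, float]:
    return (acc - z * stderr, acc + z * stderr)

# Values copied from the results JSON above.
scores = {
    "harness|arc:challenge|25 (acc_norm)": (0.3378839590443686, 0.013822047922283514),
    "harness|hellaswag|10 (acc_norm)": (0.5782712607050389, 0.004928263494616739),
    "harness|winogrande|5 (acc)": (0.5769534333070244, 0.013885055359056476),
}

for name, (acc, se) in scores.items():
    lo, hi = conf_interval(acc, se)
    print(f"{name}: {acc:.4f}  95% CI [{lo:.4f}, {hi:.4f}]")
```

Two scores whose intervals overlap substantially are not clearly distinguishable at this sample size; this is only a back-of-the-envelope check, not part of the official leaderboard aggregation.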
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_h4rz3rk4s3__TinyParlaMintLlama-1.1B | open-llm-leaderboard-old | "2024-03-09T21:07:06Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T21:06:42Z" | ---
pretty_name: Evaluation run of h4rz3rk4s3/TinyParlaMintLlama-1.1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [h4rz3rk4s3/TinyParlaMintLlama-1.1B](https://huggingface.co/h4rz3rk4s3/TinyParlaMintLlama-1.1B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_h4rz3rk4s3__TinyParlaMintLlama-1.1B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T21:04:56.625451](https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyParlaMintLlama-1.1B/blob/main/results_2024-03-09T21-04-56.625451.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2534698094379658,\n\
\ \"acc_stderr\": 0.030711486886846922,\n \"acc_norm\": 0.2548347931678427,\n\
\ \"acc_norm_stderr\": 0.03152342469518155,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3881354794094613,\n\
\ \"mc2_stderr\": 0.014221616755531881\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.29266211604095566,\n \"acc_stderr\": 0.01329591610361941,\n\
\ \"acc_norm\": 0.3165529010238908,\n \"acc_norm_stderr\": 0.01359243151906808\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.42252539334793865,\n\
\ \"acc_stderr\": 0.004929517011508213,\n \"acc_norm\": 0.558653654650468,\n\
\ \"acc_norm_stderr\": 0.0049553302773042715\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2074074074074074,\n\
\ \"acc_stderr\": 0.03502553170678318,\n \"acc_norm\": 0.2074074074074074,\n\
\ \"acc_norm_stderr\": 0.03502553170678318\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19736842105263158,\n \"acc_stderr\": 0.03238981601699397,\n\
\ \"acc_norm\": 0.19736842105263158,\n \"acc_norm_stderr\": 0.03238981601699397\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.32,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.24150943396226415,\n \"acc_stderr\": 0.02634148037111837,\n\
\ \"acc_norm\": 0.24150943396226415,\n \"acc_norm_stderr\": 0.02634148037111837\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2916666666666667,\n\
\ \"acc_stderr\": 0.03800968060554858,\n \"acc_norm\": 0.2916666666666667,\n\
\ \"acc_norm_stderr\": 0.03800968060554858\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.19,\n \"acc_stderr\": 0.039427724440366234,\n \
\ \"acc_norm\": 0.19,\n \"acc_norm_stderr\": 0.039427724440366234\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165044,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165044\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.23121387283236994,\n\
\ \"acc_stderr\": 0.0321473730202947,\n \"acc_norm\": 0.23121387283236994,\n\
\ \"acc_norm_stderr\": 0.0321473730202947\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.22127659574468084,\n \"acc_stderr\": 0.027136349602424052,\n\
\ \"acc_norm\": 0.22127659574468084,\n \"acc_norm_stderr\": 0.027136349602424052\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2620689655172414,\n \"acc_stderr\": 0.036646663372252565,\n\
\ \"acc_norm\": 0.2620689655172414,\n \"acc_norm_stderr\": 0.036646663372252565\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.020940481565334866,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.020940481565334866\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.29365079365079366,\n\
\ \"acc_stderr\": 0.040735243221471276,\n \"acc_norm\": 0.29365079365079366,\n\
\ \"acc_norm_stderr\": 0.040735243221471276\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.20967741935483872,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.20967741935483872,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.1724137931034483,\n \"acc_stderr\": 0.02657767218303658,\n\
\ \"acc_norm\": 0.1724137931034483,\n \"acc_norm_stderr\": 0.02657767218303658\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.20202020202020202,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.20202020202020202,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565318,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565318\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.21025641025641026,\n \"acc_stderr\": 0.020660597485026928,\n\
\ \"acc_norm\": 0.21025641025641026,\n \"acc_norm_stderr\": 0.020660597485026928\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24074074074074073,\n \"acc_stderr\": 0.026067159222275794,\n \
\ \"acc_norm\": 0.24074074074074073,\n \"acc_norm_stderr\": 0.026067159222275794\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.027553614467863818,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.027553614467863818\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.017712600528722727,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.017712600528722727\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.029531221160930918,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.029531221160930918\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.03039153369274154\n \
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\"\
: 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \"\
acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2396694214876033,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.2396694214876033,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22085889570552147,\n \"acc_stderr\": 0.032591773927421776,\n\
\ \"acc_norm\": 0.22085889570552147,\n \"acc_norm_stderr\": 0.032591773927421776\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.1650485436893204,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.1650485436893204,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.028911208802749503,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.028911208802749503\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.26309067688378035,\n\
\ \"acc_stderr\": 0.01574549716904904,\n \"acc_norm\": 0.26309067688378035,\n\
\ \"acc_norm_stderr\": 0.01574549716904904\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.25139664804469275,\n\
\ \"acc_stderr\": 0.014508979453553962,\n \"acc_norm\": 0.25139664804469275,\n\
\ \"acc_norm_stderr\": 0.014508979453553962\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.023805186524888153,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.023805186524888153\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n\
\ \"acc_stderr\": 0.024723861504771686,\n \"acc_norm\": 0.2540192926045016,\n\
\ \"acc_norm_stderr\": 0.024723861504771686\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2654320987654321,\n \"acc_stderr\": 0.024569223600460845,\n\
\ \"acc_norm\": 0.2654320987654321,\n \"acc_norm_stderr\": 0.024569223600460845\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.19858156028368795,\n \"acc_stderr\": 0.023798301637942103,\n \
\ \"acc_norm\": 0.19858156028368795,\n \"acc_norm_stderr\": 0.023798301637942103\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.010946570966348768,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.010946570966348768\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.24632352941176472,\n \"acc_stderr\": 0.02617343857052,\n\
\ \"acc_norm\": 0.24632352941176472,\n \"acc_norm_stderr\": 0.02617343857052\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25980392156862747,\n \"acc_stderr\": 0.017740899509177788,\n \
\ \"acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.017740899509177788\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17551020408163265,\n \"acc_stderr\": 0.024352800722970015,\n\
\ \"acc_norm\": 0.17551020408163265,\n \"acc_norm_stderr\": 0.024352800722970015\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.22885572139303484,\n\
\ \"acc_stderr\": 0.029705284056772443,\n \"acc_norm\": 0.22885572139303484,\n\
\ \"acc_norm_stderr\": 0.029705284056772443\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370518,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370518\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.01481619599193158,\n \"mc2\": 0.3881354794094613,\n\
\ \"mc2_stderr\": 0.014221616755531881\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5864246250986582,\n \"acc_stderr\": 0.013840971763195306\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/h4rz3rk4s3/TinyParlaMintLlama-1.1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-04-56.625451.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-04-56.625451.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- '**/details_harness|winogrande|5_2024-03-09T21-04-56.625451.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T21-04-56.625451.parquet'
- config_name: results
data_files:
- split: 2024_03_09T21_04_56.625451
path:
- results_2024-03-09T21-04-56.625451.parquet
- split: latest
path:
- results_2024-03-09T21-04-56.625451.parquet
---
# Dataset Card for Evaluation run of h4rz3rk4s3/TinyParlaMintLlama-1.1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [h4rz3rk4s3/TinyParlaMintLlama-1.1B](https://huggingface.co/h4rz3rk4s3/TinyParlaMintLlama-1.1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_h4rz3rk4s3__TinyParlaMintLlama-1.1B",
"harness_winogrande_5",
split="train")
```
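The per-task configuration names follow a predictable pattern (visible in the YAML above), so you can build them programmatically. The helper below is a minimal sketch, assuming the `harness_hendrycksTest_<task>_<n_shot>` naming used by this dataset; the commented `load_dataset` call shows how it would be used.

```python
def mmlu_config_name(task: str, n_shot: int = 5) -> str:
    """Build the configuration name for a hendrycksTest (MMLU) subtask,
    following the naming pattern used by this dataset's configs."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# Example usage (requires the `datasets` library and network access):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_h4rz3rk4s3__TinyParlaMintLlama-1.1B",
#     mmlu_config_name("abstract_algebra"),
#     split="latest",
# )
```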
## Latest results
These are the [latest results from run 2024-03-09T21:04:56.625451](https://huggingface.co/datasets/open-llm-leaderboard/details_h4rz3rk4s3__TinyParlaMintLlama-1.1B/blob/main/results_2024-03-09T21-04-56.625451.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task's results in its own configuration and in its "latest" split):
```python
{
"all": {
"acc": 0.2534698094379658,
"acc_stderr": 0.030711486886846922,
"acc_norm": 0.2548347931678427,
"acc_norm_stderr": 0.03152342469518155,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3881354794094613,
"mc2_stderr": 0.014221616755531881
},
"harness|arc:challenge|25": {
"acc": 0.29266211604095566,
"acc_stderr": 0.01329591610361941,
"acc_norm": 0.3165529010238908,
"acc_norm_stderr": 0.01359243151906808
},
"harness|hellaswag|10": {
"acc": 0.42252539334793865,
"acc_stderr": 0.004929517011508213,
"acc_norm": 0.558653654650468,
"acc_norm_stderr": 0.0049553302773042715
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2074074074074074,
"acc_stderr": 0.03502553170678318,
"acc_norm": 0.2074074074074074,
"acc_norm_stderr": 0.03502553170678318
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19736842105263158,
"acc_stderr": 0.03238981601699397,
"acc_norm": 0.19736842105263158,
"acc_norm_stderr": 0.03238981601699397
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.24150943396226415,
"acc_stderr": 0.02634148037111837,
"acc_norm": 0.24150943396226415,
"acc_norm_stderr": 0.02634148037111837
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.03800968060554858,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.03800968060554858
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.19,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.19,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165044,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165044
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.23121387283236994,
"acc_stderr": 0.0321473730202947,
"acc_norm": 0.23121387283236994,
"acc_norm_stderr": 0.0321473730202947
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.22127659574468084,
"acc_stderr": 0.027136349602424052,
"acc_norm": 0.22127659574468084,
"acc_norm_stderr": 0.027136349602424052
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2620689655172414,
"acc_stderr": 0.036646663372252565,
"acc_norm": 0.2620689655172414,
"acc_norm_stderr": 0.036646663372252565
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.020940481565334866,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.020940481565334866
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.29365079365079366,
"acc_stderr": 0.040735243221471276,
"acc_norm": 0.29365079365079366,
"acc_norm_stderr": 0.040735243221471276
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.20967741935483872,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.20967741935483872,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.1724137931034483,
"acc_stderr": 0.02657767218303658,
"acc_norm": 0.1724137931034483,
"acc_norm_stderr": 0.02657767218303658
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.20202020202020202,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.20202020202020202,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565318,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565318
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.21025641025641026,
"acc_stderr": 0.020660597485026928,
"acc_norm": 0.21025641025641026,
"acc_norm_stderr": 0.020660597485026928
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.026067159222275794,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.026067159222275794
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.027553614467863818,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.027553614467863818
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.017712600528722727,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.017712600528722727
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.25,
"acc_stderr": 0.029531221160930918,
"acc_norm": 0.25,
"acc_norm_stderr": 0.029531221160930918
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2396694214876033,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.2396694214876033,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22085889570552147,
"acc_stderr": 0.032591773927421776,
"acc_norm": 0.22085889570552147,
"acc_norm_stderr": 0.032591773927421776
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.1650485436893204,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.1650485436893204,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.028911208802749503,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.028911208802749503
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.26309067688378035,
"acc_stderr": 0.01574549716904904,
"acc_norm": 0.26309067688378035,
"acc_norm_stderr": 0.01574549716904904
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.25139664804469275,
"acc_stderr": 0.014508979453553962,
"acc_norm": 0.25139664804469275,
"acc_norm_stderr": 0.014508979453553962
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.023805186524888153,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.023805186524888153
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2540192926045016,
"acc_stderr": 0.024723861504771686,
"acc_norm": 0.2540192926045016,
"acc_norm_stderr": 0.024723861504771686
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2654320987654321,
"acc_stderr": 0.024569223600460845,
"acc_norm": 0.2654320987654321,
"acc_norm_stderr": 0.024569223600460845
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.19858156028368795,
"acc_stderr": 0.023798301637942103,
"acc_norm": 0.19858156028368795,
"acc_norm_stderr": 0.023798301637942103
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.010946570966348768,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.010946570966348768
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.24632352941176472,
"acc_stderr": 0.02617343857052,
"acc_norm": 0.24632352941176472,
"acc_norm_stderr": 0.02617343857052
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.017740899509177788,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.017740899509177788
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17551020408163265,
"acc_stderr": 0.024352800722970015,
"acc_norm": 0.17551020408163265,
"acc_norm_stderr": 0.024352800722970015
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772443,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772443
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.01481619599193158,
"mc2": 0.3881354794094613,
"mc2_stderr": 0.014221616755531881
},
"harness|winogrande|5": {
"acc": 0.5864246250986582,
"acc_stderr": 0.013840971763195306
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Saiteja786/TestingDataset | Saiteja786 | "2024-03-09T21:25:59Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-09T21:15:53Z" | ---
license: apache-2.0
---
|
mxronga/yoruba-proverbs-parallel-corpora | mxronga | "2024-03-09T21:26:32Z" | 0 | 1 | [
"language:yo",
"license:apache-2.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"pretrain"
] | null | "2024-03-09T21:17:37Z" | ---
license: apache-2.0
language:
- yo
tags:
- pretrain
---
Parallel corpus for Yoruba to English.
Source: http://yoruba.unl.edu/yoruba1.html |
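As a rough sketch of working with a JSON-lines parallel corpus like this one, the rows can be read with the standard library. Note the field names `yo` and `en` and the placeholder rows below are illustrative assumptions only; the card does not document the actual schema:

```python
import json

# Build a tiny stand-in JSONL payload with one record per line.
# Real corpus rows would carry actual proverb text; these are placeholders.
sample_jsonl = "\n".join([
    json.dumps({"yo": "Òwe Yorùbá kìíní", "en": "First Yoruba proverb"}, ensure_ascii=False),
    json.dumps({"yo": "Òwe Yorùbá kejì", "en": "Second Yoruba proverb"}, ensure_ascii=False),
])

# Parse each line back into a dict and split into source/target sides.
pairs = [json.loads(line) for line in sample_jsonl.splitlines()]
sources = [p["yo"] for p in pairs]  # Yoruba side
targets = [p["en"] for p in pairs]  # English side
print(len(pairs))
```

`ensure_ascii=False` keeps the Yoruba diacritics readable in the serialized lines instead of escaping them to `\uXXXX` sequences.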
open-llm-leaderboard-old/details_nlpguy__AlloyIngotNeoY | open-llm-leaderboard-old | "2024-03-09T21:32:12Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T21:31:52Z" | ---
pretty_name: Evaluation run of nlpguy/AlloyIngotNeoY
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nlpguy/AlloyIngotNeoY](https://huggingface.co/nlpguy/AlloyIngotNeoY) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nlpguy__AlloyIngotNeoY\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T21:29:38.615578](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__AlloyIngotNeoY/blob/main/results_2024-03-09T21-29-38.615578.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6499216650047986,\n\
\ \"acc_stderr\": 0.03206299874378752,\n \"acc_norm\": 0.6488512462508309,\n\
\ \"acc_norm_stderr\": 0.03273921602103512,\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7798753603234657,\n\
\ \"mc2_stderr\": 0.013710462983276021\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7107508532423208,\n \"acc_stderr\": 0.013250012579393441,\n\
\ \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7175861382194781,\n\
\ \"acc_stderr\": 0.004492535748097628,\n \"acc_norm\": 0.8911571400119498,\n\
\ \"acc_norm_stderr\": 0.0031080545633521087\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.04810840148082636,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.04810840148082636\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621505,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621505\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.0303883535518868,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.0303883535518868\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8403669724770643,\n \"acc_stderr\": 0.015703498348461763,\n \"\
acc_norm\": 0.8403669724770643,\n \"acc_norm_stderr\": 0.015703498348461763\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278884,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278884\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.016898180706973878,\n \"mc2\": 0.7798753603234657,\n\
\ \"mc2_stderr\": 0.013710462983276021\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627297\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7028051554207733,\n \
\ \"acc_stderr\": 0.012588685966624187\n }\n}\n```"
repo_url: https://huggingface.co/nlpguy/AlloyIngotNeoY
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-29-38.615578.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-29-38.615578.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- '**/details_harness|winogrande|5_2024-03-09T21-29-38.615578.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T21-29-38.615578.parquet'
- config_name: results
data_files:
- split: 2024_03_09T21_29_38.615578
path:
- results_2024-03-09T21-29-38.615578.parquet
- split: latest
path:
- results_2024-03-09T21-29-38.615578.parquet
---
# Dataset Card for Evaluation run of nlpguy/AlloyIngotNeoY
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nlpguy/AlloyIngotNeoY](https://huggingface.co/nlpguy/AlloyIngotNeoY) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nlpguy__AlloyIngotNeoY",
"harness_winogrande_5",
split="train")
```
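Once loaded, the per-task details can be post-processed locally. As a minimal sketch (using a few headline `acc_norm` values copied from the "Latest results" JSON below rather than a live download), one might aggregate scores across tasks:

```python
# Sketch: averaging a handful of acc_norm scores copied from the
# "Latest results" JSON in this card; a real workflow would read
# them from the `results` config of the dataset instead.
scores = {
    "harness|arc:challenge|25": 0.7278156996587031,
    "harness|hellaswag|10": 0.8911571400119498,
    "harness|hendrycksTest-abstract_algebra|5": 0.3,
}

mean_acc_norm = sum(scores.values()) / len(scores)
print(f"mean acc_norm over {len(scores)} tasks: {mean_acc_norm:.4f}")
```

The same pattern extends to all 57 `hendrycksTest-*` subjects once their splits are loaded.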
## Latest results
These are the [latest results from run 2024-03-09T21:29:38.615578](https://huggingface.co/datasets/open-llm-leaderboard/details_nlpguy__AlloyIngotNeoY/blob/main/results_2024-03-09T21-29-38.615578.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6499216650047986,
"acc_stderr": 0.03206299874378752,
"acc_norm": 0.6488512462508309,
"acc_norm_stderr": 0.03273921602103512,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7798753603234657,
"mc2_stderr": 0.013710462983276021
},
"harness|arc:challenge|25": {
"acc": 0.7107508532423208,
"acc_stderr": 0.013250012579393441,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.7175861382194781,
"acc_stderr": 0.004492535748097628,
"acc_norm": 0.8911571400119498,
"acc_norm_stderr": 0.0031080545633521087
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.04810840148082636,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.04810840148082636
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.0303883535518868,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.0303883535518868
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8403669724770643,
"acc_stderr": 0.015703498348461763,
"acc_norm": 0.8403669724770643,
"acc_norm_stderr": 0.015703498348461763
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278884,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278884
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.016898180706973878,
"mc2": 0.7798753603234657,
"mc2_stderr": 0.013710462983276021
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627297
},
"harness|gsm8k|5": {
"acc": 0.7028051554207733,
"acc_stderr": 0.012588685966624187
}
}
```
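Each per-task block above follows the same schema (`acc`, `acc_stderr`, and, for multiple-choice tasks, `acc_norm` / `acc_norm_stderr`), so aggregate scores can be recomputed directly from the JSON. A minimal sketch, using values copied from the results above (the two-task subset is only for illustration):

```python
import json

# A fragment mirroring the structure of the results JSON above
# (only two MMLU subtasks and GSM8K, taken verbatim from the card).
results = json.loads("""
{
    "harness|hendrycksTest-virology|5": {
        "acc": 0.5542168674698795,
        "acc_stderr": 0.03869543323472101,
        "acc_norm": 0.5542168674698795,
        "acc_norm_stderr": 0.03869543323472101
    },
    "harness|hendrycksTest-world_religions|5": {
        "acc": 0.8304093567251462,
        "acc_stderr": 0.02878210810540171,
        "acc_norm": 0.8304093567251462,
        "acc_norm_stderr": 0.02878210810540171
    },
    "harness|gsm8k|5": {
        "acc": 0.7028051554207733,
        "acc_stderr": 0.012588685966624187
    }
}
""")

# Mean accuracy over the MMLU (hendrycksTest) subtasks only.
mmlu = {k: v for k, v in results.items() if "hendrycksTest" in k}
mean_acc = sum(v["acc"] for v in mmlu.values()) / len(mmlu)
print(f"MMLU subtasks: {len(mmlu)}, mean acc: {mean_acc:.4f}")
# → MMLU subtasks: 2, mean acc: 0.6923
```

On the full card, filtering on the `hendrycksTest-` prefix recovers all 57 MMLU subtasks; the leaderboard's own aggregate is an unweighted mean of the same per-task values.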
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

---
pretty_name: Evaluation run of Test157t/Eris-Daturamix-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Test157t/Eris-Daturamix-7b](https://huggingface.co/Test157t/Eris-Daturamix-7b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Test157t__Eris-Daturamix-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T21:45:58.845417](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Eris-Daturamix-7b/blob/main/results_2024-03-09T21-45-58.845417.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.651231382482504,\n\
\ \"acc_stderr\": 0.03225322364986081,\n \"acc_norm\": 0.6506076104981492,\n\
\ \"acc_norm_stderr\": 0.03293008865161104,\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.710468255381952,\n\
\ \"mc2_stderr\": 0.014907076684352403\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7064846416382252,\n \"acc_stderr\": 0.013307250444941113,\n\
\ \"acc_norm\": 0.7278156996587031,\n \"acc_norm_stderr\": 0.013006600406423702\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7189802828121888,\n\
\ \"acc_stderr\": 0.0044857844685766675,\n \"acc_norm\": 0.8822943636725752,\n\
\ \"acc_norm_stderr\": 0.0032160063577603803\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555497,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555497\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8131313131313131,\n \"acc_stderr\": 0.027772533334218967,\n \"\
acc_norm\": 0.8131313131313131,\n \"acc_norm_stderr\": 0.027772533334218967\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033456,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8330275229357799,\n \"acc_stderr\": 0.01599015488507338,\n \"\
acc_norm\": 0.8330275229357799,\n \"acc_norm_stderr\": 0.01599015488507338\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.02584501798692692,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.02584501798692692\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128137,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128137\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8186462324393359,\n\
\ \"acc_stderr\": 0.013778693778464076,\n \"acc_norm\": 0.8186462324393359,\n\
\ \"acc_norm_stderr\": 0.013778693778464076\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7398843930635838,\n \"acc_stderr\": 0.023618678310069367,\n\
\ \"acc_norm\": 0.7398843930635838,\n \"acc_norm_stderr\": 0.023618678310069367\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4122905027932961,\n\
\ \"acc_stderr\": 0.016463200238114525,\n \"acc_norm\": 0.4122905027932961,\n\
\ \"acc_norm_stderr\": 0.016463200238114525\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.0279715413701706,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.0279715413701706\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6650326797385621,\n \"acc_stderr\": 0.019094228167000325,\n \
\ \"acc_norm\": 0.6650326797385621,\n \"acc_norm_stderr\": 0.019094228167000325\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5495716034271726,\n\
\ \"mc1_stderr\": 0.01741726437196764,\n \"mc2\": 0.710468255381952,\n\
\ \"mc2_stderr\": 0.014907076684352403\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272962\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6770280515542078,\n \
\ \"acc_stderr\": 0.012880360794851806\n }\n}\n```"
repo_url: https://huggingface.co/Test157t/Eris-Daturamix-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-45-58.845417.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-45-58.845417.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- '**/details_harness|winogrande|5_2024-03-09T21-45-58.845417.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T21-45-58.845417.parquet'
- config_name: results
data_files:
- split: 2024_03_09T21_45_58.845417
path:
- results_2024-03-09T21-45-58.845417.parquet
- split: latest
path:
- results_2024-03-09T21-45-58.845417.parquet
---
# Dataset Card for Evaluation run of Test157t/Eris-Daturamix-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Test157t/Eris-Daturamix-7b](https://huggingface.co/Test157t/Eris-Daturamix-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Test157t__Eris-Daturamix-7b",
"harness_winogrande_5",
split="train")
```
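For reference, the relationship between a configuration name and the parquet glob it resolves to (as listed in the front matter above) can be sketched as follows. This is a minimal illustration of the naming convention only; the helper `config_to_glob` is hypothetical, not part of the `datasets` API, and it does not cover colon-separated suites such as `truthfulqa:mc`:

```python
def config_to_glob(config_name: str, timestamp: str) -> str:
    """Turn a config name like 'harness_hendrycksTest_anatomy_5' into the
    '**/details_...' parquet glob pattern used in this card's data_files."""
    # Strip the leading "harness_" prefix and split off the trailing few-shot count.
    body, _, shots = config_name.removeprefix("harness_").rpartition("_")
    # hendrycksTest subtasks use a '-' separator after the suite name.
    if body.startswith("hendrycksTest_"):
        body = "hendrycksTest-" + body[len("hendrycksTest_"):]
    return f"**/details_harness|{body}|{shots}_{timestamp}.parquet"

print(config_to_glob("harness_hendrycksTest_anatomy_5", "2024-03-09T21-45-58.845417"))
# -> **/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-45-58.845417.parquet
```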
## Latest results
These are the [latest results from run 2024-03-09T21:45:58.845417](https://huggingface.co/datasets/open-llm-leaderboard/details_Test157t__Eris-Daturamix-7b/blob/main/results_2024-03-09T21-45-58.845417.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.651231382482504,
"acc_stderr": 0.03225322364986081,
"acc_norm": 0.6506076104981492,
"acc_norm_stderr": 0.03293008865161104,
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.710468255381952,
"mc2_stderr": 0.014907076684352403
},
"harness|arc:challenge|25": {
"acc": 0.7064846416382252,
"acc_stderr": 0.013307250444941113,
"acc_norm": 0.7278156996587031,
"acc_norm_stderr": 0.013006600406423702
},
"harness|hellaswag|10": {
"acc": 0.7189802828121888,
"acc_stderr": 0.0044857844685766675,
"acc_norm": 0.8822943636725752,
"acc_norm_stderr": 0.0032160063577603803
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555497,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555497
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8131313131313131,
"acc_stderr": 0.027772533334218967,
"acc_norm": 0.8131313131313131,
"acc_norm_stderr": 0.027772533334218967
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033456,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8330275229357799,
"acc_stderr": 0.01599015488507338,
"acc_norm": 0.8330275229357799,
"acc_norm_stderr": 0.01599015488507338
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.02584501798692692,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.02584501798692692
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128137,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128137
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8186462324393359,
"acc_stderr": 0.013778693778464076,
"acc_norm": 0.8186462324393359,
"acc_norm_stderr": 0.013778693778464076
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7398843930635838,
"acc_stderr": 0.023618678310069367,
"acc_norm": 0.7398843930635838,
"acc_norm_stderr": 0.023618678310069367
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4122905027932961,
"acc_stderr": 0.016463200238114525,
"acc_norm": 0.4122905027932961,
"acc_norm_stderr": 0.016463200238114525
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.0279715413701706,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.0279715413701706
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6650326797385621,
"acc_stderr": 0.019094228167000325,
"acc_norm": 0.6650326797385621,
"acc_norm_stderr": 0.019094228167000325
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5495716034271726,
"mc1_stderr": 0.01741726437196764,
"mc2": 0.710468255381952,
"mc2_stderr": 0.014907076684352403
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272962
},
"harness|gsm8k|5": {
"acc": 0.6770280515542078,
"acc_stderr": 0.012880360794851806
}
}
```
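The per-task entries above share a uniform shape (`acc`/`acc_stderr`, plus `acc_norm` where applicable), which makes them easy to post-process. As a minimal sketch (using a small hand-copied excerpt of the JSON above rather than the full file), this averages the `acc` of the MMLU (`hendrycksTest`) subtasks:

```python
# Average the `acc` of all MMLU ("hendrycksTest") subtasks in a results dict.
# The excerpt below copies a few entries verbatim from the results above;
# the full JSON contains all 57 subtasks.
results = {
    "harness|hendrycksTest-machine_learning|5": {"acc": 0.44642857142857145},
    "harness|hendrycksTest-management|5": {"acc": 0.7572815533980582},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8760683760683761},
}

mmlu_accs = [
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU average over {len(mmlu_accs)} subtasks: {mmlu_avg:.4f}")
```

The same pattern extends to `acc_norm` or to the full results file once loaded with `json.load`.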
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
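Pending a full description, one structural detail is worth noting: each per-task configuration exposes timestamp-named splits (e.g. `2024_03_09T21_46_15.698807`, as in the configs of these cards) plus a `latest` alias pointing at the most recent run. A minimal sketch of decoding such a split name back into a timestamp (the format is assumed from the split names used in these leaderboard cards):

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    """Parse a leaderboard split name such as '2024_03_09T21_46_15.698807'.

    The date uses '_' separators; the time uses '_' between hour, minute and
    second, and '.' before the microseconds.
    """
    date_part, time_part = split_name.split("T")
    year, month, day = (int(x) for x in date_part.split("_"))
    hour, minute, rest = time_part.split("_")
    second, microsecond = rest.split(".")
    return datetime(year, month, day, int(hour), int(minute),
                    int(second), int(microsecond))

print(parse_split_timestamp("2024_03_09T21_46_15.698807"))
```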
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_Corianas__NearalMistral-2x7B | open-llm-leaderboard-old | "2024-03-09T21:48:54Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T21:48:35Z" | ---
pretty_name: Evaluation run of Corianas/NearalMistral-2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Corianas/NearalMistral-2x7B](https://huggingface.co/Corianas/NearalMistral-2x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Corianas__NearalMistral-2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T21:46:15.698807](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__NearalMistral-2x7B/blob/main/results_2024-03-09T21-46-15.698807.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5636223801118392,\n\
\ \"acc_stderr\": 0.03386464675038975,\n \"acc_norm\": 0.5683727034445784,\n\
\ \"acc_norm_stderr\": 0.03457513981632913,\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.570299499009942,\n\
\ \"mc2_stderr\": 0.015436924382027407\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5401023890784983,\n \"acc_stderr\": 0.01456431885692485,\n\
\ \"acc_norm\": 0.5742320819112628,\n \"acc_norm_stderr\": 0.014449464278868807\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.589523999203346,\n\
\ \"acc_stderr\": 0.004909148239488275,\n \"acc_norm\": 0.7767377016530571,\n\
\ \"acc_norm_stderr\": 0.004155816900505153\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4222222222222222,\n\
\ \"acc_stderr\": 0.042667634040995814,\n \"acc_norm\": 0.4222222222222222,\n\
\ \"acc_norm_stderr\": 0.042667634040995814\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5592105263157895,\n \"acc_stderr\": 0.04040311062490435,\n\
\ \"acc_norm\": 0.5592105263157895,\n \"acc_norm_stderr\": 0.04040311062490435\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.029773082713319878,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.029773082713319878\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6388888888888888,\n\
\ \"acc_stderr\": 0.04016660030451233,\n \"acc_norm\": 0.6388888888888888,\n\
\ \"acc_norm_stderr\": 0.04016660030451233\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.30392156862745096,\n \"acc_stderr\": 0.04576665403207763,\n\
\ \"acc_norm\": 0.30392156862745096,\n \"acc_norm_stderr\": 0.04576665403207763\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.04605661864718381,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.04605661864718381\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4723404255319149,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.4723404255319149,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.045796394220704334,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.045796394220704334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878152,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878152\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.36772486772486773,\n \"acc_stderr\": 0.024833839825562413,\n \"\
acc_norm\": 0.36772486772486773,\n \"acc_norm_stderr\": 0.024833839825562413\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.043758884927270605,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.043758884927270605\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552735,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552735\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n\
\ \"acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\"\
: 0.56,\n \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.703030303030303,\n \"acc_stderr\": 0.035679697722680495,\n\
\ \"acc_norm\": 0.703030303030303,\n \"acc_norm_stderr\": 0.035679697722680495\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.03095405547036589,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.03095405547036589\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.031821550509166456,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.031821550509166456\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5128205128205128,\n \"acc_stderr\": 0.025342671293807257,\n\
\ \"acc_norm\": 0.5128205128205128,\n \"acc_norm_stderr\": 0.025342671293807257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085626,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085626\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5756302521008403,\n \"acc_stderr\": 0.032104790510157764,\n\
\ \"acc_norm\": 0.5756302521008403,\n \"acc_norm_stderr\": 0.032104790510157764\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7321100917431193,\n \"acc_stderr\": 0.018987462257978652,\n \"\
acc_norm\": 0.7321100917431193,\n \"acc_norm_stderr\": 0.018987462257978652\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n \"acc_norm\"\
: 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.7254901960784313,\n\
\ \"acc_stderr\": 0.031321798030832904,\n \"acc_norm\": 0.7254901960784313,\n\
\ \"acc_norm_stderr\": 0.031321798030832904\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.6919831223628692,\n \"acc_stderr\": 0.03005238933560571,\n\
\ \"acc_norm\": 0.6919831223628692,\n \"acc_norm_stderr\": 0.03005238933560571\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6143497757847534,\n\
\ \"acc_stderr\": 0.03266842214289201,\n \"acc_norm\": 0.6143497757847534,\n\
\ \"acc_norm_stderr\": 0.03266842214289201\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7107438016528925,\n \"acc_stderr\": 0.041391127276354626,\n \"\
acc_norm\": 0.7107438016528925,\n \"acc_norm_stderr\": 0.041391127276354626\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6666666666666666,\n\
\ \"acc_stderr\": 0.04557239513497752,\n \"acc_norm\": 0.6666666666666666,\n\
\ \"acc_norm_stderr\": 0.04557239513497752\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6993865030674846,\n \"acc_stderr\": 0.03602511318806771,\n\
\ \"acc_norm\": 0.6993865030674846,\n \"acc_norm_stderr\": 0.03602511318806771\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7471264367816092,\n\
\ \"acc_stderr\": 0.015543377313719681,\n \"acc_norm\": 0.7471264367816092,\n\
\ \"acc_norm_stderr\": 0.015543377313719681\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6329479768786127,\n \"acc_stderr\": 0.025950054337654082,\n\
\ \"acc_norm\": 0.6329479768786127,\n \"acc_norm_stderr\": 0.025950054337654082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27932960893854747,\n\
\ \"acc_stderr\": 0.015005762446786164,\n \"acc_norm\": 0.27932960893854747,\n\
\ \"acc_norm_stderr\": 0.015005762446786164\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6172839506172839,\n \"acc_stderr\": 0.027044538138402602,\n\
\ \"acc_norm\": 0.6172839506172839,\n \"acc_norm_stderr\": 0.027044538138402602\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.02909767559946393,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.02909767559946393\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4230769230769231,\n\
\ \"acc_stderr\": 0.012618204066588392,\n \"acc_norm\": 0.4230769230769231,\n\
\ \"acc_norm_stderr\": 0.012618204066588392\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5588235294117647,\n \"acc_stderr\": 0.030161911930767102,\n\
\ \"acc_norm\": 0.5588235294117647,\n \"acc_norm_stderr\": 0.030161911930767102\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.565359477124183,\n \"acc_stderr\": 0.02005426920072646,\n \
\ \"acc_norm\": 0.565359477124183,\n \"acc_norm_stderr\": 0.02005426920072646\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505417,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505417\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6530612244897959,\n \"acc_stderr\": 0.030472526026726492,\n\
\ \"acc_norm\": 0.6530612244897959,\n \"acc_norm_stderr\": 0.030472526026726492\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7860696517412935,\n\
\ \"acc_stderr\": 0.02899690969332891,\n \"acc_norm\": 0.7860696517412935,\n\
\ \"acc_norm_stderr\": 0.02899690969332891\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.76,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4578313253012048,\n\
\ \"acc_stderr\": 0.038786267710023595,\n \"acc_norm\": 0.4578313253012048,\n\
\ \"acc_norm_stderr\": 0.038786267710023595\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7192982456140351,\n \"acc_stderr\": 0.03446296217088427,\n\
\ \"acc_norm\": 0.7192982456140351,\n \"acc_norm_stderr\": 0.03446296217088427\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.570299499009942,\n\
\ \"mc2_stderr\": 0.015436924382027407\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865348\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3161485974222896,\n \
\ \"acc_stderr\": 0.012807630673451496\n }\n}\n```"
repo_url: https://huggingface.co/Corianas/NearalMistral-2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-46-15.698807.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-46-15.698807.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- '**/details_harness|winogrande|5_2024-03-09T21-46-15.698807.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T21-46-15.698807.parquet'
- config_name: results
data_files:
- split: 2024_03_09T21_46_15.698807
path:
- results_2024-03-09T21-46-15.698807.parquet
- split: latest
path:
- results_2024-03-09T21-46-15.698807.parquet
---
# Dataset Card for Evaluation run of Corianas/NearalMistral-2x7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Corianas/NearalMistral-2x7B](https://huggingface.co/Corianas/NearalMistral-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Corianas__NearalMistral-2x7B",
"harness_winogrande_5",
split="train")
```
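As noted above, each run's split is named with the run timestamp (e.g. `2024_03_09T21_46_15.698807`), which is the ISO timestamp with `-` and `:` replaced by underscores. A small helper (hypothetical, not part of the `datasets` API) can recover the original timestamp from a split name, assuming this naming convention holds:

```python
from datetime import datetime

def parse_split_timestamp(split_name: str) -> datetime:
    # Split names such as "2024_03_09T21_46_15.698807" encode the run
    # timestamp with underscores in place of "-" and ":".
    return datetime.strptime(split_name, "%Y_%m_%dT%H_%M_%S.%f")

ts = parse_split_timestamp("2024_03_09T21_46_15.698807")
print(ts.isoformat())  # 2024-03-09T21:46:15.698807
```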
## Latest results
These are the [latest results from run 2024-03-09T21:46:15.698807](https://huggingface.co/datasets/open-llm-leaderboard/details_Corianas__NearalMistral-2x7B/blob/main/results_2024-03-09T21-46-15.698807.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5636223801118392,
"acc_stderr": 0.03386464675038975,
"acc_norm": 0.5683727034445784,
"acc_norm_stderr": 0.03457513981632913,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.570299499009942,
"mc2_stderr": 0.015436924382027407
},
"harness|arc:challenge|25": {
"acc": 0.5401023890784983,
"acc_stderr": 0.01456431885692485,
"acc_norm": 0.5742320819112628,
"acc_norm_stderr": 0.014449464278868807
},
"harness|hellaswag|10": {
"acc": 0.589523999203346,
"acc_stderr": 0.004909148239488275,
"acc_norm": 0.7767377016530571,
"acc_norm_stderr": 0.004155816900505153
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4222222222222222,
"acc_stderr": 0.042667634040995814,
"acc_norm": 0.4222222222222222,
"acc_norm_stderr": 0.042667634040995814
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5592105263157895,
"acc_stderr": 0.04040311062490435,
"acc_norm": 0.5592105263157895,
"acc_norm_stderr": 0.04040311062490435
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.029773082713319878,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.029773082713319878
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.04016660030451233,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.04016660030451233
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.30392156862745096,
"acc_stderr": 0.04576665403207763,
"acc_norm": 0.30392156862745096,
"acc_norm_stderr": 0.04576665403207763
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4723404255319149,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.4723404255319149,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.045796394220704334,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.045796394220704334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878152,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878152
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.36772486772486773,
"acc_stderr": 0.024833839825562413,
"acc_norm": 0.36772486772486773,
"acc_norm_stderr": 0.024833839825562413
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.043758884927270605,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.043758884927270605
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552735,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552735
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.703030303030303,
"acc_stderr": 0.035679697722680495,
"acc_norm": 0.703030303030303,
"acc_norm_stderr": 0.035679697722680495
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.03095405547036589,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.03095405547036589
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.031821550509166456,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.031821550509166456
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5128205128205128,
"acc_stderr": 0.025342671293807257,
"acc_norm": 0.5128205128205128,
"acc_norm_stderr": 0.025342671293807257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085626,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085626
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5756302521008403,
"acc_stderr": 0.032104790510157764,
"acc_norm": 0.5756302521008403,
"acc_norm_stderr": 0.032104790510157764
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7321100917431193,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.7321100917431193,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.031321798030832904,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.031321798030832904
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6919831223628692,
"acc_stderr": 0.03005238933560571,
"acc_norm": 0.6919831223628692,
"acc_norm_stderr": 0.03005238933560571
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6143497757847534,
"acc_stderr": 0.03266842214289201,
"acc_norm": 0.6143497757847534,
"acc_norm_stderr": 0.03266842214289201
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7107438016528925,
"acc_stderr": 0.041391127276354626,
"acc_norm": 0.7107438016528925,
"acc_norm_stderr": 0.041391127276354626
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.04557239513497752,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.04557239513497752
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6993865030674846,
"acc_stderr": 0.03602511318806771,
"acc_norm": 0.6993865030674846,
"acc_norm_stderr": 0.03602511318806771
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7471264367816092,
"acc_stderr": 0.015543377313719681,
"acc_norm": 0.7471264367816092,
"acc_norm_stderr": 0.015543377313719681
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6329479768786127,
"acc_stderr": 0.025950054337654082,
"acc_norm": 0.6329479768786127,
"acc_norm_stderr": 0.025950054337654082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27932960893854747,
"acc_stderr": 0.015005762446786164,
"acc_norm": 0.27932960893854747,
"acc_norm_stderr": 0.015005762446786164
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6172839506172839,
"acc_stderr": 0.027044538138402602,
"acc_norm": 0.6172839506172839,
"acc_norm_stderr": 0.027044538138402602
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.02909767559946393,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.02909767559946393
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4230769230769231,
"acc_stderr": 0.012618204066588392,
"acc_norm": 0.4230769230769231,
"acc_norm_stderr": 0.012618204066588392
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5588235294117647,
"acc_stderr": 0.030161911930767102,
"acc_norm": 0.5588235294117647,
"acc_norm_stderr": 0.030161911930767102
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.565359477124183,
"acc_stderr": 0.02005426920072646,
"acc_norm": 0.565359477124183,
"acc_norm_stderr": 0.02005426920072646
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.04582004841505417,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.04582004841505417
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6530612244897959,
"acc_stderr": 0.030472526026726492,
"acc_norm": 0.6530612244897959,
"acc_norm_stderr": 0.030472526026726492
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7860696517412935,
"acc_stderr": 0.02899690969332891,
"acc_norm": 0.7860696517412935,
"acc_norm_stderr": 0.02899690969332891
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4578313253012048,
"acc_stderr": 0.038786267710023595,
"acc_norm": 0.4578313253012048,
"acc_norm_stderr": 0.038786267710023595
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7192982456140351,
"acc_stderr": 0.03446296217088427,
"acc_norm": 0.7192982456140351,
"acc_norm_stderr": 0.03446296217088427
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.570299499009942,
"mc2_stderr": 0.015436924382027407
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.012134386019865348
},
"harness|gsm8k|5": {
"acc": 0.3161485974222896,
"acc_stderr": 0.012807630673451496
}
}
```
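The per-task entries above follow the harness key pattern `harness|<task>|<num_fewshot>`. As a small standalone sketch (using only the stdlib `json` module, with two entries copied from the results above; the real results file holds every task), the metrics can be parsed like this:

```python
import json

# Two per-task entries copied from the results blob above.
results = json.loads("""
{
  "harness|winogrande|5": {"acc": 0.7521704814522494, "acc_stderr": 0.012134386019865348},
  "harness|gsm8k|5": {"acc": 0.3161485974222896, "acc_stderr": 0.012807630673451496}
}
""")

for task, metrics in results.items():
    # Keys follow the pattern "harness|<task>|<num_fewshot>".
    name = task.split("|")[1]
    print(f"{name}: acc={metrics['acc']:.4f} +/- {metrics['acc_stderr']:.4f}")
    # prints:
    # winogrande: acc=0.7522 +/- 0.0121
    # gsm8k: acc=0.3161 +/- 0.0128
```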
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_pabloce__Dolphin-2.8-slerp | open-llm-leaderboard-old | "2024-03-15T23:10:55Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T21:53:51Z" | ---
pretty_name: Evaluation run of pabloce/Dolphin-2.8-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [pabloce/Dolphin-2.8-slerp](https://huggingface.co/pabloce/Dolphin-2.8-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_pabloce__Dolphin-2.8-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-15T23:08:17.610047](https://huggingface.co/datasets/open-llm-leaderboard/details_pabloce__Dolphin-2.8-slerp/blob/main/results_2024-03-15T23-08-17.610047.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.648123129371981,\n\
\ \"acc_stderr\": 0.03218045054384059,\n \"acc_norm\": 0.6481813544914691,\n\
\ \"acc_norm_stderr\": 0.03284542152986091,\n \"mc1\": 0.47980416156670747,\n\
\ \"mc1_stderr\": 0.01748921684973705,\n \"mc2\": 0.6519707532329383,\n\
\ \"mc2_stderr\": 0.015190095767666237\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.643344709897611,\n \"acc_stderr\": 0.013998056902620199,\n\
\ \"acc_norm\": 0.6800341296928327,\n \"acc_norm_stderr\": 0.01363134580701619\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6868153754232225,\n\
\ \"acc_stderr\": 0.004628409084218761,\n \"acc_norm\": 0.8650667197769368,\n\
\ \"acc_norm_stderr\": 0.003409540533249836\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6222222222222222,\n\
\ \"acc_stderr\": 0.04188307537595853,\n \"acc_norm\": 0.6222222222222222,\n\
\ \"acc_norm_stderr\": 0.04188307537595853\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.02854479331905533,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.02854479331905533\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7986111111111112,\n\
\ \"acc_stderr\": 0.03353647469713839,\n \"acc_norm\": 0.7986111111111112,\n\
\ \"acc_norm_stderr\": 0.03353647469713839\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404904,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404904\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328972,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328972\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6974358974358974,\n \"acc_stderr\": 0.023290888053772725,\n\
\ \"acc_norm\": 0.6974358974358974,\n \"acc_norm_stderr\": 0.023290888053772725\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34444444444444444,\n \"acc_stderr\": 0.02897264888484427,\n \
\ \"acc_norm\": 0.34444444444444444,\n \"acc_norm_stderr\": 0.02897264888484427\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566545,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566545\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3973509933774834,\n \"acc_stderr\": 0.0399552400768168,\n \"acc_norm\"\
: 0.3973509933774834,\n \"acc_norm_stderr\": 0.0399552400768168\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8293577981651377,\n\
\ \"acc_stderr\": 0.01612927102509986,\n \"acc_norm\": 0.8293577981651377,\n\
\ \"acc_norm_stderr\": 0.01612927102509986\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.02759917430064077,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.02759917430064077\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.025530100460233504,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.025530100460233504\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624734,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624734\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.03487825168497892,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.03487825168497892\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690877,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690877\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.013890862162876168,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.013890862162876168\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3653631284916201,\n\
\ \"acc_stderr\": 0.01610483388014229,\n \"acc_norm\": 0.3653631284916201,\n\
\ \"acc_norm_stderr\": 0.01610483388014229\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7091503267973857,\n \"acc_stderr\": 0.02600480036395213,\n\
\ \"acc_norm\": 0.7091503267973857,\n \"acc_norm_stderr\": 0.02600480036395213\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188936,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188936\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7314814814814815,\n \"acc_stderr\": 0.02465968518596728,\n\
\ \"acc_norm\": 0.7314814814814815,\n \"acc_norm_stderr\": 0.02465968518596728\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48936170212765956,\n \"acc_stderr\": 0.029820747191422473,\n \
\ \"acc_norm\": 0.48936170212765956,\n \"acc_norm_stderr\": 0.029820747191422473\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46936114732724904,\n\
\ \"acc_stderr\": 0.012746237711716634,\n \"acc_norm\": 0.46936114732724904,\n\
\ \"acc_norm_stderr\": 0.012746237711716634\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.028245687391462923,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.028245687391462923\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6715686274509803,\n \"acc_stderr\": 0.018999707383162673,\n \
\ \"acc_norm\": 0.6715686274509803,\n \"acc_norm_stderr\": 0.018999707383162673\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.028535560337128445,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.028535560337128445\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.02991312723236804,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.02991312723236804\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.47980416156670747,\n\
\ \"mc1_stderr\": 0.01748921684973705,\n \"mc2\": 0.6519707532329383,\n\
\ \"mc2_stderr\": 0.015190095767666237\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8216258879242304,\n \"acc_stderr\": 0.01075935201485593\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.686125852918878,\n \
\ \"acc_stderr\": 0.012782681251053194\n }\n}\n```"
repo_url: https://huggingface.co/pabloce/Dolphin-2.8-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|arc:challenge|25_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|gsm8k|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hellaswag|10_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-51-37.101802.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T23-08-17.610047.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-15T23-08-17.610047.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- '**/details_harness|winogrande|5_2024-03-09T21-51-37.101802.parquet'
- split: 2024_03_15T23_08_17.610047
path:
- '**/details_harness|winogrande|5_2024-03-15T23-08-17.610047.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-15T23-08-17.610047.parquet'
- config_name: results
data_files:
- split: 2024_03_09T21_51_37.101802
path:
- results_2024-03-09T21-51-37.101802.parquet
- split: 2024_03_15T23_08_17.610047
path:
- results_2024-03-15T23-08-17.610047.parquet
- split: latest
path:
- results_2024-03-15T23-08-17.610047.parquet
---
# Dataset Card for Evaluation run of pabloce/Dolphin-2.8-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [pabloce/Dolphin-2.8-slerp](https://huggingface.co/pabloce/Dolphin-2.8-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_pabloce__Dolphin-2.8-slerp",
"harness_winogrande_5",
split="train")
```
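The per-run split names are derived from the run timestamp (for example, the run `2024-03-15T23:08:17.610047` is stored under the split `2024_03_15T23_08_17.610047`). A minimal sketch of that mapping, inferred from the split and file names in this card (the helper name is hypothetical):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp to the split name used in this dataset.

    Inferred convention: "-" and ":" become "_", everything else is
    unchanged, e.g. "2024-03-15T23:08:17.610047" ->
    "2024_03_15T23_08_17.610047".
    """
    return ts.replace("-", "_").replace(":", "_")
```

This lets you target a specific historical run instead of the `latest` split when calling `load_dataset`.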
## Latest results
These are the [latest results from run 2024-03-15T23:08:17.610047](https://huggingface.co/datasets/open-llm-leaderboard/details_pabloce__Dolphin-2.8-slerp/blob/main/results_2024-03-15T23-08-17.610047.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.648123129371981,
"acc_stderr": 0.03218045054384059,
"acc_norm": 0.6481813544914691,
"acc_norm_stderr": 0.03284542152986091,
"mc1": 0.47980416156670747,
"mc1_stderr": 0.01748921684973705,
"mc2": 0.6519707532329383,
"mc2_stderr": 0.015190095767666237
},
"harness|arc:challenge|25": {
"acc": 0.643344709897611,
"acc_stderr": 0.013998056902620199,
"acc_norm": 0.6800341296928327,
"acc_norm_stderr": 0.01363134580701619
},
"harness|hellaswag|10": {
"acc": 0.6868153754232225,
"acc_stderr": 0.004628409084218761,
"acc_norm": 0.8650667197769368,
"acc_norm_stderr": 0.003409540533249836
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6222222222222222,
"acc_stderr": 0.04188307537595853,
"acc_norm": 0.6222222222222222,
"acc_norm_stderr": 0.04188307537595853
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.02854479331905533,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.02854479331905533
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7986111111111112,
"acc_stderr": 0.03353647469713839,
"acc_norm": 0.7986111111111112,
"acc_norm_stderr": 0.03353647469713839
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404904,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404904
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328972,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328972
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6974358974358974,
"acc_stderr": 0.023290888053772725,
"acc_norm": 0.6974358974358974,
"acc_norm_stderr": 0.023290888053772725
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34444444444444444,
"acc_stderr": 0.02897264888484427,
"acc_norm": 0.34444444444444444,
"acc_norm_stderr": 0.02897264888484427
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566545,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566545
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3973509933774834,
"acc_stderr": 0.0399552400768168,
"acc_norm": 0.3973509933774834,
"acc_norm_stderr": 0.0399552400768168
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8293577981651377,
"acc_stderr": 0.01612927102509986,
"acc_norm": 0.8293577981651377,
"acc_norm_stderr": 0.01612927102509986
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.02759917430064077,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.02759917430064077
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.025530100460233504,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.025530100460233504
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624734,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624734
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.03487825168497892,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.03487825168497892
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690877,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690877
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.013890862162876168,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.013890862162876168
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3653631284916201,
"acc_stderr": 0.01610483388014229,
"acc_norm": 0.3653631284916201,
"acc_norm_stderr": 0.01610483388014229
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7091503267973857,
"acc_stderr": 0.02600480036395213,
"acc_norm": 0.7091503267973857,
"acc_norm_stderr": 0.02600480036395213
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188936,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188936
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.02465968518596728,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.02465968518596728
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48936170212765956,
"acc_stderr": 0.029820747191422473,
"acc_norm": 0.48936170212765956,
"acc_norm_stderr": 0.029820747191422473
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46936114732724904,
"acc_stderr": 0.012746237711716634,
"acc_norm": 0.46936114732724904,
"acc_norm_stderr": 0.012746237711716634
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.028245687391462923,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.028245687391462923
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6715686274509803,
"acc_stderr": 0.018999707383162673,
"acc_norm": 0.6715686274509803,
"acc_norm_stderr": 0.018999707383162673
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.028535560337128445,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.028535560337128445
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.02991312723236804,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.02991312723236804
},
"harness|truthfulqa:mc|0": {
"mc1": 0.47980416156670747,
"mc1_stderr": 0.01748921684973705,
"mc2": 0.6519707532329383,
"mc2_stderr": 0.015190095767666237
},
"harness|winogrande|5": {
"acc": 0.8216258879242304,
"acc_stderr": 0.01075935201485593
},
"harness|gsm8k|5": {
"acc": 0.686125852918878,
"acc_stderr": 0.012782681251053194
}
}
```
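The `"all"` block above aggregates the per-task scores. A minimal sketch of how a macro-average accuracy could be recomputed from per-task entries (illustrative two-task subset taken from the JSON above, not the full benchmark):

```python
# Hypothetical subset of the results dict shown above; the real dict
# contains one entry per evaluated task.
results = {
    "harness|hendrycksTest-virology|5": {"acc": 0.5421686746987951},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.8128654970760234},
}

# Unweighted mean over the hendrycksTest (MMLU) tasks present in the dict.
mmlu_accs = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
macro_avg = sum(mmlu_accs) / len(mmlu_accs)
```

Note this is an unweighted mean; the leaderboard's own aggregation may weight or group tasks differently.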
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of Abhaykoul/MediKAI
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Abhaykoul/MediKAI](https://huggingface.co/Abhaykoul/MediKAI) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Abhaykoul__MediKAI\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T21:57:21.649294](https://huggingface.co/datasets/open-llm-leaderboard/details_Abhaykoul__MediKAI/blob/main/results_2024-03-09T21-57-21.649294.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.4852558560383581,\n\
\ \"acc_stderr\": 0.03436927411417114,\n \"acc_norm\": 0.49445469772933043,\n\
\ \"acc_norm_stderr\": 0.035245948661633913,\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.48774494292294535,\n\
\ \"mc2_stderr\": 0.016662118503392984\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.4138225255972696,\n \"acc_stderr\": 0.014392730009221012,\n\
\ \"acc_norm\": 0.46501706484641636,\n \"acc_norm_stderr\": 0.014575583922019667\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.45140410276837284,\n\
\ \"acc_stderr\": 0.004966158142645411,\n \"acc_norm\": 0.6055566620195181,\n\
\ \"acc_norm_stderr\": 0.00487731968363908\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.04316378599511324,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.04316378599511324\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.47368421052631576,\n \"acc_stderr\": 0.04063302731486671,\n\
\ \"acc_norm\": 0.47368421052631576,\n \"acc_norm_stderr\": 0.04063302731486671\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.48,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.48,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5245283018867924,\n \"acc_stderr\": 0.030735822206205608,\n\
\ \"acc_norm\": 0.5245283018867924,\n \"acc_norm_stderr\": 0.030735822206205608\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5486111111111112,\n\
\ \"acc_stderr\": 0.04161402398403279,\n \"acc_norm\": 0.5486111111111112,\n\
\ \"acc_norm_stderr\": 0.04161402398403279\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.44,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.44,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.48554913294797686,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.48554913294797686,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.43829787234042555,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.43829787234042555,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3253968253968254,\n\
\ \"acc_stderr\": 0.041905964388711366,\n \"acc_norm\": 0.3253968253968254,\n\
\ \"acc_norm_stderr\": 0.041905964388711366\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117317,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117317\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.02737987122994324,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.02737987122994324\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.3399014778325123,\n \"acc_stderr\": 0.033327690684107895,\n\
\ \"acc_norm\": 0.3399014778325123,\n \"acc_norm_stderr\": 0.033327690684107895\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.509090909090909,\n \"acc_stderr\": 0.03903698647748441,\n\
\ \"acc_norm\": 0.509090909090909,\n \"acc_norm_stderr\": 0.03903698647748441\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.6515151515151515,\n \"acc_stderr\": 0.03394853965156402,\n \"\
acc_norm\": 0.6515151515151515,\n \"acc_norm_stderr\": 0.03394853965156402\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.6010362694300518,\n \"acc_stderr\": 0.03533999094065696,\n\
\ \"acc_norm\": 0.6010362694300518,\n \"acc_norm_stderr\": 0.03533999094065696\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.441025641025641,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.441025641025641,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.27037037037037037,\n \"acc_stderr\": 0.027080372815145658,\n \
\ \"acc_norm\": 0.27037037037037037,\n \"acc_norm_stderr\": 0.027080372815145658\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.4957983193277311,\n \"acc_stderr\": 0.03247734334448111,\n \
\ \"acc_norm\": 0.4957983193277311,\n \"acc_norm_stderr\": 0.03247734334448111\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.691743119266055,\n \"acc_stderr\": 0.01979836669836724,\n \"acc_norm\"\
: 0.691743119266055,\n \"acc_norm_stderr\": 0.01979836669836724\n },\n\
\ \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\": 0.3194444444444444,\n\
\ \"acc_stderr\": 0.0317987634217685,\n \"acc_norm\": 0.3194444444444444,\n\
\ \"acc_norm_stderr\": 0.0317987634217685\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.03476099060501636,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.03476099060501636\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.6329113924050633,\n \"acc_stderr\": 0.031376240725616185,\n \
\ \"acc_norm\": 0.6329113924050633,\n \"acc_norm_stderr\": 0.031376240725616185\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.5650224215246636,\n\
\ \"acc_stderr\": 0.033272833702713445,\n \"acc_norm\": 0.5650224215246636,\n\
\ \"acc_norm_stderr\": 0.033272833702713445\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6259541984732825,\n \"acc_stderr\": 0.04243869242230524,\n\
\ \"acc_norm\": 0.6259541984732825,\n \"acc_norm_stderr\": 0.04243869242230524\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6033057851239669,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.6033057851239669,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5740740740740741,\n\
\ \"acc_stderr\": 0.0478034362693679,\n \"acc_norm\": 0.5740740740740741,\n\
\ \"acc_norm_stderr\": 0.0478034362693679\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.4785276073619632,\n \"acc_stderr\": 0.0392474687675113,\n\
\ \"acc_norm\": 0.4785276073619632,\n \"acc_norm_stderr\": 0.0392474687675113\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.375,\n\
\ \"acc_stderr\": 0.04595091388086298,\n \"acc_norm\": 0.375,\n \
\ \"acc_norm_stderr\": 0.04595091388086298\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6504854368932039,\n \"acc_stderr\": 0.047211885060971716,\n\
\ \"acc_norm\": 0.6504854368932039,\n \"acc_norm_stderr\": 0.047211885060971716\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.632183908045977,\n\
\ \"acc_stderr\": 0.017243828891846263,\n \"acc_norm\": 0.632183908045977,\n\
\ \"acc_norm_stderr\": 0.017243828891846263\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.45664739884393063,\n \"acc_stderr\": 0.026817718130348923,\n\
\ \"acc_norm\": 0.45664739884393063,\n \"acc_norm_stderr\": 0.026817718130348923\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.26145251396648045,\n\
\ \"acc_stderr\": 0.014696599650364557,\n \"acc_norm\": 0.26145251396648045,\n\
\ \"acc_norm_stderr\": 0.014696599650364557\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5686274509803921,\n \"acc_stderr\": 0.02835895631342354,\n\
\ \"acc_norm\": 0.5686274509803921,\n \"acc_norm_stderr\": 0.02835895631342354\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.5112540192926045,\n\
\ \"acc_stderr\": 0.028390897396863533,\n \"acc_norm\": 0.5112540192926045,\n\
\ \"acc_norm_stderr\": 0.028390897396863533\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5339506172839507,\n \"acc_stderr\": 0.027756535257347663,\n\
\ \"acc_norm\": 0.5339506172839507,\n \"acc_norm_stderr\": 0.027756535257347663\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.37943262411347517,\n \"acc_stderr\": 0.028947338851614105,\n \
\ \"acc_norm\": 0.37943262411347517,\n \"acc_norm_stderr\": 0.028947338851614105\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3363754889178618,\n\
\ \"acc_stderr\": 0.012067083079452222,\n \"acc_norm\": 0.3363754889178618,\n\
\ \"acc_norm_stderr\": 0.012067083079452222\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.030187532060329394,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.030187532060329394\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.4869281045751634,\n \"acc_stderr\": 0.020220920829626916,\n \
\ \"acc_norm\": 0.4869281045751634,\n \"acc_norm_stderr\": 0.020220920829626916\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.5454545454545454,\n\
\ \"acc_stderr\": 0.04769300568972745,\n \"acc_norm\": 0.5454545454545454,\n\
\ \"acc_norm_stderr\": 0.04769300568972745\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5591836734693878,\n \"acc_stderr\": 0.03178419114175363,\n\
\ \"acc_norm\": 0.5591836734693878,\n \"acc_norm_stderr\": 0.03178419114175363\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6467661691542289,\n\
\ \"acc_stderr\": 0.03379790611796777,\n \"acc_norm\": 0.6467661691542289,\n\
\ \"acc_norm_stderr\": 0.03379790611796777\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.57,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26560587515299877,\n\
\ \"mc1_stderr\": 0.015461027627253595,\n \"mc2\": 0.48774494292294535,\n\
\ \"mc2_stderr\": 0.016662118503392984\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6172059984214681,\n \"acc_stderr\": 0.013660946109442004\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.003447819272388992\n }\n}\n```"
repo_url: https://huggingface.co/Abhaykoul/MediKAI
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-57-21.649294.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T21-57-21.649294.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- '**/details_harness|winogrande|5_2024-03-09T21-57-21.649294.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T21-57-21.649294.parquet'
- config_name: results
data_files:
- split: 2024_03_09T21_57_21.649294
path:
- results_2024-03-09T21-57-21.649294.parquet
- split: latest
path:
- results_2024-03-09T21-57-21.649294.parquet
---
# Dataset Card for Evaluation run of Abhaykoul/MediKAI
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Abhaykoul/MediKAI](https://huggingface.co/Abhaykoul/MediKAI) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the most recent results.
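Timestamped splits are named like `2024_03_09T21_57_21.649294`, alongside a `latest` alias. If you ever need to resolve the newest timestamped split yourself, for instance when comparing several runs, a minimal sketch (the helper name is illustrative, not part of the dataset API):

```python
def newest_split(split_names):
    """Return the most recent timestamped split name.

    Timestamps like 2024_03_09T21_57_21.649294 are zero-padded,
    so lexicographic order matches chronological order.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped)
```

In most cases, simply passing `split="latest"` to `load_dataset` is enough.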
An additional configuration "results" stores all the aggregated results of the run (it is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Abhaykoul__MediKAI",
	"harness_winogrande_5",
	split="latest")
```
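The per-subtask MMLU configs defined above follow a regular naming pattern (`harness_hendrycksTest_<subject>_<n_shot>`), so config names can be built programmatically instead of spelled out one by one. A small sketch, where `subtask_config` is an illustrative helper and not part of the dataset API:

```python
def subtask_config(subject: str, n_shot: int = 5) -> str:
    # Per-subtask configs are named harness_hendrycksTest_<subject>_<n_shot>
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# For example, to load the 5-shot virology details:
# data = load_dataset("open-llm-leaderboard/details_Abhaykoul__MediKAI",
#                     subtask_config("virology"),
#                     split="latest")
```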
## Latest results
These are the [latest results from run 2024-03-09T21:57:21.649294](https://huggingface.co/datasets/open-llm-leaderboard/details_Abhaykoul__MediKAI/blob/main/results_2024-03-09T21-57-21.649294.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.4852558560383581,
"acc_stderr": 0.03436927411417114,
"acc_norm": 0.49445469772933043,
"acc_norm_stderr": 0.035245948661633913,
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.48774494292294535,
"mc2_stderr": 0.016662118503392984
},
"harness|arc:challenge|25": {
"acc": 0.4138225255972696,
"acc_stderr": 0.014392730009221012,
"acc_norm": 0.46501706484641636,
"acc_norm_stderr": 0.014575583922019667
},
"harness|hellaswag|10": {
"acc": 0.45140410276837284,
"acc_stderr": 0.004966158142645411,
"acc_norm": 0.6055566620195181,
"acc_norm_stderr": 0.00487731968363908
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.04316378599511324,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.04316378599511324
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.04063302731486671,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.04063302731486671
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5245283018867924,
"acc_stderr": 0.030735822206205608,
"acc_norm": 0.5245283018867924,
"acc_norm_stderr": 0.030735822206205608
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5486111111111112,
"acc_stderr": 0.04161402398403279,
"acc_norm": 0.5486111111111112,
"acc_norm_stderr": 0.04161402398403279
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.48554913294797686,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.48554913294797686,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.43829787234042555,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.43829787234042555,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3253968253968254,
"acc_stderr": 0.041905964388711366,
"acc_norm": 0.3253968253968254,
"acc_norm_stderr": 0.041905964388711366
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117317,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117317
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.02737987122994324,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.02737987122994324
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.3399014778325123,
"acc_stderr": 0.033327690684107895,
"acc_norm": 0.3399014778325123,
"acc_norm_stderr": 0.033327690684107895
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.509090909090909,
"acc_stderr": 0.03903698647748441,
"acc_norm": 0.509090909090909,
"acc_norm_stderr": 0.03903698647748441
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.6515151515151515,
"acc_stderr": 0.03394853965156402,
"acc_norm": 0.6515151515151515,
"acc_norm_stderr": 0.03394853965156402
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.6010362694300518,
"acc_stderr": 0.03533999094065696,
"acc_norm": 0.6010362694300518,
"acc_norm_stderr": 0.03533999094065696
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.441025641025641,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.441025641025641,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.27037037037037037,
"acc_stderr": 0.027080372815145658,
"acc_norm": 0.27037037037037037,
"acc_norm_stderr": 0.027080372815145658
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.4957983193277311,
"acc_stderr": 0.03247734334448111,
"acc_norm": 0.4957983193277311,
"acc_norm_stderr": 0.03247734334448111
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.691743119266055,
"acc_stderr": 0.01979836669836724,
"acc_norm": 0.691743119266055,
"acc_norm_stderr": 0.01979836669836724
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.3194444444444444,
"acc_stderr": 0.0317987634217685,
"acc_norm": 0.3194444444444444,
"acc_norm_stderr": 0.0317987634217685
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.03476099060501636,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.03476099060501636
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6329113924050633,
"acc_stderr": 0.031376240725616185,
"acc_norm": 0.6329113924050633,
"acc_norm_stderr": 0.031376240725616185
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.5650224215246636,
"acc_stderr": 0.033272833702713445,
"acc_norm": 0.5650224215246636,
"acc_norm_stderr": 0.033272833702713445
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6259541984732825,
"acc_stderr": 0.04243869242230524,
"acc_norm": 0.6259541984732825,
"acc_norm_stderr": 0.04243869242230524
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6033057851239669,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.6033057851239669,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5740740740740741,
"acc_stderr": 0.0478034362693679,
"acc_norm": 0.5740740740740741,
"acc_norm_stderr": 0.0478034362693679
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.4785276073619632,
"acc_stderr": 0.0392474687675113,
"acc_norm": 0.4785276073619632,
"acc_norm_stderr": 0.0392474687675113
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.375,
"acc_stderr": 0.04595091388086298,
"acc_norm": 0.375,
"acc_norm_stderr": 0.04595091388086298
},
"harness|hendrycksTest-management|5": {
"acc": 0.6504854368932039,
"acc_stderr": 0.047211885060971716,
"acc_norm": 0.6504854368932039,
"acc_norm_stderr": 0.047211885060971716
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.632183908045977,
"acc_stderr": 0.017243828891846263,
"acc_norm": 0.632183908045977,
"acc_norm_stderr": 0.017243828891846263
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.45664739884393063,
"acc_stderr": 0.026817718130348923,
"acc_norm": 0.45664739884393063,
"acc_norm_stderr": 0.026817718130348923
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.26145251396648045,
"acc_stderr": 0.014696599650364557,
"acc_norm": 0.26145251396648045,
"acc_norm_stderr": 0.014696599650364557
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5686274509803921,
"acc_stderr": 0.02835895631342354,
"acc_norm": 0.5686274509803921,
"acc_norm_stderr": 0.02835895631342354
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.5112540192926045,
"acc_stderr": 0.028390897396863533,
"acc_norm": 0.5112540192926045,
"acc_norm_stderr": 0.028390897396863533
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5339506172839507,
"acc_stderr": 0.027756535257347663,
"acc_norm": 0.5339506172839507,
"acc_norm_stderr": 0.027756535257347663
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.37943262411347517,
"acc_stderr": 0.028947338851614105,
"acc_norm": 0.37943262411347517,
"acc_norm_stderr": 0.028947338851614105
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3363754889178618,
"acc_stderr": 0.012067083079452222,
"acc_norm": 0.3363754889178618,
"acc_norm_stderr": 0.012067083079452222
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.030187532060329394,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.030187532060329394
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.4869281045751634,
"acc_stderr": 0.020220920829626916,
"acc_norm": 0.4869281045751634,
"acc_norm_stderr": 0.020220920829626916
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.5454545454545454,
"acc_stderr": 0.04769300568972745,
"acc_norm": 0.5454545454545454,
"acc_norm_stderr": 0.04769300568972745
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5591836734693878,
"acc_stderr": 0.03178419114175363,
"acc_norm": 0.5591836734693878,
"acc_norm_stderr": 0.03178419114175363
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6467661691542289,
"acc_stderr": 0.03379790611796777,
"acc_norm": 0.6467661691542289,
"acc_norm_stderr": 0.03379790611796777
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26560587515299877,
"mc1_stderr": 0.015461027627253595,
"mc2": 0.48774494292294535,
"mc2_stderr": 0.016662118503392984
},
"harness|winogrande|5": {
"acc": 0.6172059984214681,
"acc_stderr": 0.013660946109442004
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.003447819272388992
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
arjunvb/drivetrack | arjunvb | "2024-03-15T02:59:51Z" | 0 | 1 | [
"license:mit",
"region:us"
] | null | "2024-03-09T21:59:35Z" | ---
license: mit
---
|
ambrosfitz/ps_data_2 | ambrosfitz | "2024-03-09T22:08:37Z" | 0 | 0 | [
"task_categories:text-generation",
"task_categories:question-answering",
"language:en",
"license:cc-by-3.0",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"US history - primary sources"
] | [
"text-generation",
"question-answering"
] | "2024-03-09T22:03:53Z" | ---
license: cc-by-3.0
task_categories:
- text-generation
- question-answering
language:
- en
tags:
- US history
- primary sources
size_categories:
- 1K<n<10K
---
This is a collection of primary source documents from American history. Each document was analyzed by GPT-3.5-Turbo for context, which then generates a question and provides an answer. |
Ubaidbhat/databaseBenchmarkQA | Ubaidbhat | "2024-03-09T22:29:29Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-09T22:29:28Z" | ---
dataset_info:
features:
- name: context
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
- name: source_doc
dtype: string
- name: groundedness_score
dtype: int64
- name: relevance_score
dtype: int64
splits:
- name: train
num_bytes: 513993
num_examples: 263
download_size: 268275
dataset_size: 513993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
alexredna/only_mixtral-8x7b-instruct-v0.1-chosen | alexredna | "2024-03-09T22:40:03Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-09T22:38:32Z" | ---
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
dataset_info:
features:
- name: source
dtype: string
- name: category
dtype: string
- name: prompt
dtype: string
- name: messages
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 428379537.6951206
num_examples: 189282
download_size: 240185969
dataset_size: 428379537.6951206
---
# Dataset Card for "only_mixtral-8x7b-instruct-v0.1-chosen"
[More Information needed](https://github.com/huggingface/datasets/blob/main/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards) |
BambiMC/ts_test | BambiMC | "2024-03-11T09:57:27Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-09T22:52:38Z" | ---
license: mit
---
|
open-llm-leaderboard-old/details_ziniuli__Mistral-7B-ReMax-v0.1 | open-llm-leaderboard-old | "2024-03-11T18:10:35Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T23:07:25Z" | ---
pretty_name: Evaluation run of ziniuli/Mistral-7B-ReMax-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ziniuli/Mistral-7B-ReMax-v0.1](https://huggingface.co/ziniuli/Mistral-7B-ReMax-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T18:07:55.437908](https://huggingface.co/datasets/open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1/blob/main/results_2024-03-11T18-07-55.437908.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6076779028291911,\n\
\ \"acc_stderr\": 0.03315115503196382,\n \"acc_norm\": 0.61214149482687,\n\
\ \"acc_norm_stderr\": 0.03382378734979988,\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6815516476213737,\n\
\ \"mc2_stderr\": 0.015177768821414346\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5964163822525598,\n \"acc_stderr\": 0.014337158914268445,\n\
\ \"acc_norm\": 0.6331058020477816,\n \"acc_norm_stderr\": 0.014084133118104301\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6692889862577176,\n\
\ \"acc_stderr\": 0.004695076629884538,\n \"acc_norm\": 0.8498307110137423,\n\
\ \"acc_norm_stderr\": 0.0035650718701954473\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6118421052631579,\n \"acc_stderr\": 0.03965842097512744,\n\
\ \"acc_norm\": 0.6118421052631579,\n \"acc_norm_stderr\": 0.03965842097512744\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6875,\n\
\ \"acc_stderr\": 0.038760854559127644,\n \"acc_norm\": 0.6875,\n\
\ \"acc_norm_stderr\": 0.038760854559127644\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5953757225433526,\n\
\ \"acc_stderr\": 0.03742461193887248,\n \"acc_norm\": 0.5953757225433526,\n\
\ \"acc_norm_stderr\": 0.03742461193887248\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5361702127659574,\n \"acc_stderr\": 0.032600385118357715,\n\
\ \"acc_norm\": 0.5361702127659574,\n \"acc_norm_stderr\": 0.032600385118357715\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.38095238095238093,\n \"acc_stderr\": 0.025010749116137602,\n \"\
acc_norm\": 0.38095238095238093,\n \"acc_norm_stderr\": 0.025010749116137602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.02578772318072387,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.02578772318072387\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.558974358974359,\n \"acc_stderr\": 0.025174048384000745,\n \
\ \"acc_norm\": 0.558974358974359,\n \"acc_norm_stderr\": 0.025174048384000745\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.02822644674968351,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.02822644674968351\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059278,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059278\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.038969819642573754,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.038969819642573754\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.017266742087630797,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.017266742087630797\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044812,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044812\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.04726835553719099,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.04726835553719099\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326466,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326466\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \
\ \"acc_norm\": 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.014866821664709583,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.014866821664709583\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.015609929559348408,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.015609929559348408\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046626,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046626\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.026003301117885135,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.026003301117885135\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.691358024691358,\n \"acc_stderr\": 0.025702640260603746,\n\
\ \"acc_norm\": 0.691358024691358,\n \"acc_norm_stderr\": 0.025702640260603746\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43415906127770537,\n\
\ \"acc_stderr\": 0.012659033237067248,\n \"acc_norm\": 0.43415906127770537,\n\
\ \"acc_norm_stderr\": 0.012659033237067248\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.02952009569768776,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.02952009569768776\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6421568627450981,\n \"acc_stderr\": 0.019393058402355435,\n \
\ \"acc_norm\": 0.6421568627450981,\n \"acc_norm_stderr\": 0.019393058402355435\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.029043088683304328,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.029043088683304328\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7562189054726368,\n\
\ \"acc_stderr\": 0.03036049015401464,\n \"acc_norm\": 0.7562189054726368,\n\
\ \"acc_norm_stderr\": 0.03036049015401464\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6815516476213737,\n\
\ \"mc2_stderr\": 0.015177768821414346\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3957543593631539,\n \
\ \"acc_stderr\": 0.01346982370104881\n }\n}\n```"
repo_url: https://huggingface.co/ziniuli/Mistral-7B-ReMax-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|arc:challenge|25_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|gsm8k|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hellaswag|10_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-05-10.060154.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-07-55.437908.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T18-07-55.437908.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- '**/details_harness|winogrande|5_2024-03-09T23-05-10.060154.parquet'
- split: 2024_03_11T18_07_55.437908
path:
- '**/details_harness|winogrande|5_2024-03-11T18-07-55.437908.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T18-07-55.437908.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_05_10.060154
path:
- results_2024-03-09T23-05-10.060154.parquet
- split: 2024_03_11T18_07_55.437908
path:
- results_2024-03-11T18-07-55.437908.parquet
- split: latest
path:
- results_2024-03-11T18-07-55.437908.parquet
---
# Dataset Card for Evaluation run of ziniuli/Mistral-7B-ReMax-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ziniuli/Mistral-7B-ReMax-v0.1](https://huggingface.co/ziniuli/Mistral-7B-ReMax-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1",
"harness_winogrande_5",
split="train")
```
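The timestamped split names listed in this card appear to follow a simple convention: the run timestamp with its `-` and `:` characters replaced by `_`. A minimal sketch of a helper based on that observed pattern (the function name is illustrative, not part of the `datasets` API):

```python
# Sketch (assumption): map a run timestamp, as it appears in the results
# filenames, to the corresponding split name in this dataset. The mapping is
# inferred from the split names listed in this card, not from a documented schema.
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-11T18:07:55.437908"))
# → 2024_03_11T18_07_55.437908
```

The resulting name can then be passed as the `split` argument in place of `"train"` or `"latest"` to pin the details of one specific run.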
## Latest results
These are the [latest results from run 2024-03-11T18:07:55.437908](https://huggingface.co/datasets/open-llm-leaderboard/details_ziniuli__Mistral-7B-ReMax-v0.1/blob/main/results_2024-03-11T18-07-55.437908.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6076779028291911,
"acc_stderr": 0.03315115503196382,
"acc_norm": 0.61214149482687,
"acc_norm_stderr": 0.03382378734979988,
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6815516476213737,
"mc2_stderr": 0.015177768821414346
},
"harness|arc:challenge|25": {
"acc": 0.5964163822525598,
"acc_stderr": 0.014337158914268445,
"acc_norm": 0.6331058020477816,
"acc_norm_stderr": 0.014084133118104301
},
"harness|hellaswag|10": {
"acc": 0.6692889862577176,
"acc_stderr": 0.004695076629884538,
"acc_norm": 0.8498307110137423,
"acc_norm_stderr": 0.0035650718701954473
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6118421052631579,
"acc_stderr": 0.03965842097512744,
"acc_norm": 0.6118421052631579,
"acc_norm_stderr": 0.03965842097512744
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6875,
"acc_stderr": 0.038760854559127644,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.038760854559127644
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5953757225433526,
"acc_stderr": 0.03742461193887248,
"acc_norm": 0.5953757225433526,
"acc_norm_stderr": 0.03742461193887248
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5361702127659574,
"acc_stderr": 0.032600385118357715,
"acc_norm": 0.5361702127659574,
"acc_norm_stderr": 0.032600385118357715
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.025010749116137602,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.025010749116137602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.02710482632810094,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.02710482632810094
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.02578772318072387,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.02578772318072387
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.558974358974359,
"acc_stderr": 0.025174048384000745,
"acc_norm": 0.558974358974359,
"acc_norm_stderr": 0.025174048384000745
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.02822644674968351,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.02822644674968351
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059278,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059278
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.038969819642573754,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.038969819642573754
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.017266742087630797,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.017266742087630797
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044812,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044812
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.04726835553719099,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.04726835553719099
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326466,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326466
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.014866821664709583,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.014866821664709583
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348408,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348408
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046626,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046626
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.026003301117885135,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.026003301117885135
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.691358024691358,
"acc_stderr": 0.025702640260603746,
"acc_norm": 0.691358024691358,
"acc_norm_stderr": 0.025702640260603746
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43415906127770537,
"acc_stderr": 0.012659033237067248,
"acc_norm": 0.43415906127770537,
"acc_norm_stderr": 0.012659033237067248
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.02952009569768776,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.02952009569768776
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6421568627450981,
"acc_stderr": 0.019393058402355435,
"acc_norm": 0.6421568627450981,
"acc_norm_stderr": 0.019393058402355435
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.029043088683304328,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.029043088683304328
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7562189054726368,
"acc_stderr": 0.03036049015401464,
"acc_norm": 0.7562189054726368,
"acc_norm_stderr": 0.03036049015401464
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6815516476213737,
"mc2_stderr": 0.015177768821414346
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698338
},
"harness|gsm8k|5": {
"acc": 0.3957543593631539,
"acc_stderr": 0.01346982370104881
}
}
```
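Because every per-task entry above uses the same `harness|<task>|<n_shots>` key pattern, aggregate metrics such as the MMLU macro-average can be recomputed directly from this dictionary. A minimal sketch, assuming the `harness|hendrycksTest-` prefix shown in the JSON (the helper name and the inline sample are made up for illustration):

```python
# Macro-average "acc" over the MMLU ("hendrycksTest") subtasks of a results
# dict shaped like the JSON above. The key prefix is taken from this card's
# output and is an assumption, not a documented schema.
def mmlu_macro_average(results: dict) -> float:
    accs = [v["acc"] for k, v in results.items()
            if k.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)

sample = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5703703703703704},
    "harness|hendrycksTest-virology|5": {"acc": 0.5},
    "harness|winogrande|5": {"acc": 0.7734806629834254},  # ignored by the filter
}
print(round(mmlu_macro_average(sample), 4))
# → 0.5352
```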
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test | open-llm-leaderboard-old | "2024-03-09T23:10:44Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T23:10:23Z" | ---
pretty_name: Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T23:08:06.310382](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test/blob/main/results_2024-03-09T23-08-06.310382.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6538249490414755,\n\
\ \"acc_stderr\": 0.03205307034724896,\n \"acc_norm\": 0.6534435010472049,\n\
\ \"acc_norm_stderr\": 0.03272118621923929,\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.77524154156829,\n\
\ \"mc2_stderr\": 0.013791360215680813\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7013651877133106,\n \"acc_stderr\": 0.013374078615068745,\n\
\ \"acc_norm\": 0.7312286689419796,\n \"acc_norm_stderr\": 0.012955065963710695\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7155945030870344,\n\
\ \"acc_stderr\": 0.004502088287470137,\n \"acc_norm\": 0.8908583947420833,\n\
\ \"acc_norm_stderr\": 0.003111795320787943\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6444444444444445,\n\
\ \"acc_stderr\": 0.04135176749720385,\n \"acc_norm\": 0.6444444444444445,\n\
\ \"acc_norm_stderr\": 0.04135176749720385\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119669,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119669\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.59,\n \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\"\
: 0.59,\n \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6589595375722543,\n\
\ \"acc_stderr\": 0.03614665424180826,\n \"acc_norm\": 0.6589595375722543,\n\
\ \"acc_norm_stderr\": 0.03614665424180826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108102,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108102\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778394,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778394\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7870967741935484,\n \"acc_stderr\": 0.02328766512726855,\n \"\
acc_norm\": 0.7870967741935484,\n \"acc_norm_stderr\": 0.02328766512726855\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5270935960591133,\n \"acc_stderr\": 0.03512819077876106,\n \"\
acc_norm\": 0.5270935960591133,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768763,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768763\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.39072847682119205,\n \"acc_stderr\": 0.039837983066598075,\n \"\
acc_norm\": 0.39072847682119205,\n \"acc_norm_stderr\": 0.039837983066598075\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.01577623925616323,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.01577623925616323\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752598,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752598\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993464,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993464\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4424581005586592,\n\
\ \"acc_stderr\": 0.016611393687268584,\n \"acc_norm\": 0.4424581005586592,\n\
\ \"acc_norm_stderr\": 0.016611393687268584\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015055,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015055\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6875,\n \"acc_stderr\": 0.02815637344037142,\n \
\ \"acc_norm\": 0.6875,\n \"acc_norm_stderr\": 0.02815637344037142\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507205,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507205\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.87,\n \"acc_stderr\": 0.03379976689896308,\n \
\ \"acc_norm\": 0.87,\n \"acc_norm_stderr\": 0.03379976689896308\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5662650602409639,\n\
\ \"acc_stderr\": 0.03858158940685516,\n \"acc_norm\": 0.5662650602409639,\n\
\ \"acc_norm_stderr\": 0.03858158940685516\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.627906976744186,\n\
\ \"mc1_stderr\": 0.01692109011881403,\n \"mc2\": 0.77524154156829,\n\
\ \"mc2_stderr\": 0.013791360215680813\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8468823993685872,\n \"acc_stderr\": 0.010120623252272956\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6884003032600455,\n \
\ \"acc_stderr\": 0.01275737537675494\n }\n}\n```"
repo_url: https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-06.310382.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-08-06.310382.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- '**/details_harness|winogrande|5_2024-03-09T23-08-06.310382.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T23-08-06.310382.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_08_06.310382
path:
- results_2024-03-09T23-08-06.310382.parquet
- split: latest
path:
- results_2024-03-09T23-08-06.310382.parquet
---
# Dataset Card for Evaluation run of eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test](https://huggingface.co/eren23/ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-09T23:08:06.310382](https://huggingface.co/datasets/open-llm-leaderboard/details_eren23__ogno-monarch-jaskier-merge-7b-OH-PREF-DPO-v4-test/blob/main/results_2024-03-09T23-08-06.310382.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6538249490414755,
"acc_stderr": 0.03205307034724896,
"acc_norm": 0.6534435010472049,
"acc_norm_stderr": 0.03272118621923929,
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.77524154156829,
"mc2_stderr": 0.013791360215680813
},
"harness|arc:challenge|25": {
"acc": 0.7013651877133106,
"acc_stderr": 0.013374078615068745,
"acc_norm": 0.7312286689419796,
"acc_norm_stderr": 0.012955065963710695
},
"harness|hellaswag|10": {
"acc": 0.7155945030870344,
"acc_stderr": 0.004502088287470137,
"acc_norm": 0.8908583947420833,
"acc_norm_stderr": 0.003111795320787943
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6444444444444445,
"acc_stderr": 0.04135176749720385,
"acc_norm": 0.6444444444444445,
"acc_norm_stderr": 0.04135176749720385
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119669,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119669
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.03614665424180826,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.03614665424180826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108102,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108102
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778394,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778394
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7870967741935484,
"acc_stderr": 0.02328766512726855,
"acc_norm": 0.7870967741935484,
"acc_norm_stderr": 0.02328766512726855
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5270935960591133,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.5270935960591133,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768763,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768763
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.39072847682119205,
"acc_stderr": 0.039837983066598075,
"acc_norm": 0.39072847682119205,
"acc_norm_stderr": 0.039837983066598075
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.01577623925616323,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.01577623925616323
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752598,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752598
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993464,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993464
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4424581005586592,
"acc_stderr": 0.016611393687268584,
"acc_norm": 0.4424581005586592,
"acc_norm_stderr": 0.016611393687268584
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015055,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015055
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6875,
"acc_stderr": 0.02815637344037142,
"acc_norm": 0.6875,
"acc_norm_stderr": 0.02815637344037142
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507205,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507205
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.87,
"acc_stderr": 0.03379976689896308,
"acc_norm": 0.87,
"acc_norm_stderr": 0.03379976689896308
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5662650602409639,
"acc_stderr": 0.03858158940685516,
"acc_norm": 0.5662650602409639,
"acc_norm_stderr": 0.03858158940685516
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.627906976744186,
"mc1_stderr": 0.01692109011881403,
"mc2": 0.77524154156829,
"mc2_stderr": 0.013791360215680813
},
"harness|winogrande|5": {
"acc": 0.8468823993685872,
"acc_stderr": 0.010120623252272956
},
"harness|gsm8k|5": {
"acc": 0.6884003032600455,
"acc_stderr": 0.01275737537675494
}
}
```
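The nested results payload above can also be processed programmatically. As an illustrative sketch (the `results` dict below reproduces only a few entries from the JSON above, and the parsing logic is an assumption, not part of the evaluation harness itself), per-task accuracies can be pulled out like this:

```python
# Illustrative sketch: extract per-task accuracy from a results payload
# shaped like the JSON above (only a few entries are reproduced here).
results = {
    "all": {"acc": 0.6538249490414755, "acc_norm": 0.6534435010472049},
    "harness|arc:challenge|25": {"acc": 0.7013651877133106, "acc_norm": 0.7312286689419796},
    "harness|hellaswag|10": {"acc": 0.7155945030870344, "acc_norm": 0.8908583947420833},
    "harness|winogrande|5": {"acc": 0.8468823993685872},
}

# Per-task accuracies, skipping the aggregate "all" entry.
per_task_acc = {
    task: metrics["acc"]
    for task, metrics in results.items()
    if task != "all"
}

# Task with the highest accuracy among the entries above.
best_task = max(per_task_acc, key=per_task_acc.get)
```

The same pattern applies to the full JSON once loaded (for instance with `json.load` on the results file linked above).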
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_InnerI__InnerILLM-0x00d0-7B-slerp | open-llm-leaderboard-old | "2024-03-09T23:11:27Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T23:11:06Z" | ---
pretty_name: Evaluation run of InnerI/InnerILLM-0x00d0-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [InnerI/InnerILLM-0x00d0-7B-slerp](https://huggingface.co/InnerI/InnerILLM-0x00d0-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_InnerI__InnerILLM-0x00d0-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T23:08:50.068628](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerILLM-0x00d0-7B-slerp/blob/main/results_2024-03-09T23-08-50.068628.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6511782363779303,\n\
\ \"acc_stderr\": 0.03198321204026635,\n \"acc_norm\": 0.6530684920502713,\n\
\ \"acc_norm_stderr\": 0.03262648468991683,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5351031185202262,\n\
\ \"mc2_stderr\": 0.014961733868018287\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6237201365187713,\n \"acc_stderr\": 0.014157022555407163,\n\
\ \"acc_norm\": 0.6578498293515358,\n \"acc_norm_stderr\": 0.013864152159177278\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6571400119498108,\n\
\ \"acc_stderr\": 0.004736950810617791,\n \"acc_norm\": 0.8521210914160526,\n\
\ \"acc_norm_stderr\": 0.0035425443194051424\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6148148148148148,\n\
\ \"acc_stderr\": 0.04203921040156279,\n \"acc_norm\": 0.6148148148148148,\n\
\ \"acc_norm_stderr\": 0.04203921040156279\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6936416184971098,\n\
\ \"acc_stderr\": 0.035149425512674394,\n \"acc_norm\": 0.6936416184971098,\n\
\ \"acc_norm_stderr\": 0.035149425512674394\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6085106382978723,\n \"acc_stderr\": 0.03190701242326812,\n\
\ \"acc_norm\": 0.6085106382978723,\n \"acc_norm_stderr\": 0.03190701242326812\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4074074074074074,\n \"acc_stderr\": 0.02530590624159063,\n \"\
acc_norm\": 0.4074074074074074,\n \"acc_norm_stderr\": 0.02530590624159063\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7903225806451613,\n\
\ \"acc_stderr\": 0.023157879349083522,\n \"acc_norm\": 0.7903225806451613,\n\
\ \"acc_norm_stderr\": 0.023157879349083522\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8808290155440415,\n \"acc_stderr\": 0.023381935348121437,\n\
\ \"acc_norm\": 0.8808290155440415,\n \"acc_norm_stderr\": 0.023381935348121437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6846153846153846,\n \"acc_stderr\": 0.02355964698318994,\n \
\ \"acc_norm\": 0.6846153846153846,\n \"acc_norm_stderr\": 0.02355964698318994\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640763,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640763\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8055555555555556,\n\
\ \"acc_stderr\": 0.038260763248848646,\n \"acc_norm\": 0.8055555555555556,\n\
\ \"acc_norm_stderr\": 0.038260763248848646\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.03226219377286775,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.03226219377286775\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.020930193185179323,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.020930193185179323\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768079\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546835,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546835\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3843575418994413,\n\
\ \"acc_stderr\": 0.0162690886639594,\n \"acc_norm\": 0.3843575418994413,\n\
\ \"acc_norm_stderr\": 0.0162690886639594\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.023993501709042103,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.023993501709042103\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46088657105606257,\n\
\ \"acc_stderr\": 0.012731102790504514,\n \"acc_norm\": 0.46088657105606257,\n\
\ \"acc_norm_stderr\": 0.012731102790504514\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6911764705882353,\n \"acc_stderr\": 0.02806499816704009,\n\
\ \"acc_norm\": 0.6911764705882353,\n \"acc_norm_stderr\": 0.02806499816704009\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8855721393034826,\n\
\ \"acc_stderr\": 0.022509345325101706,\n \"acc_norm\": 0.8855721393034826,\n\
\ \"acc_norm_stderr\": 0.022509345325101706\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.5351031185202262,\n\
\ \"mc2_stderr\": 0.014961733868018287\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8058405682715075,\n \"acc_stderr\": 0.011116983392392657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6141015921152388,\n \
\ \"acc_stderr\": 0.013409077471319168\n }\n}\n```"
repo_url: https://huggingface.co/InnerI/InnerILLM-0x00d0-7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-50.068628.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-08-50.068628.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- '**/details_harness|winogrande|5_2024-03-09T23-08-50.068628.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T23-08-50.068628.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_08_50.068628
path:
- results_2024-03-09T23-08-50.068628.parquet
- split: latest
path:
- results_2024-03-09T23-08-50.068628.parquet
---
# Dataset Card for Evaluation run of InnerI/InnerILLM-0x00d0-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [InnerI/InnerILLM-0x00d0-7B-slerp](https://huggingface.co/InnerI/InnerILLM-0x00d0-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_InnerI__InnerILLM-0x00d0-7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-09T23:08:50.068628](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerILLM-0x00d0-7B-slerp/blob/main/results_2024-03-09T23-08-50.068628.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6511782363779303,
"acc_stderr": 0.03198321204026635,
"acc_norm": 0.6530684920502713,
"acc_norm_stderr": 0.03262648468991683,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5351031185202262,
"mc2_stderr": 0.014961733868018287
},
"harness|arc:challenge|25": {
"acc": 0.6237201365187713,
"acc_stderr": 0.014157022555407163,
"acc_norm": 0.6578498293515358,
"acc_norm_stderr": 0.013864152159177278
},
"harness|hellaswag|10": {
"acc": 0.6571400119498108,
"acc_stderr": 0.004736950810617791,
"acc_norm": 0.8521210914160526,
"acc_norm_stderr": 0.0035425443194051424
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6148148148148148,
"acc_stderr": 0.04203921040156279,
"acc_norm": 0.6148148148148148,
"acc_norm_stderr": 0.04203921040156279
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6936416184971098,
"acc_stderr": 0.035149425512674394,
"acc_norm": 0.6936416184971098,
"acc_norm_stderr": 0.035149425512674394
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6085106382978723,
"acc_stderr": 0.03190701242326812,
"acc_norm": 0.6085106382978723,
"acc_norm_stderr": 0.03190701242326812
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4074074074074074,
"acc_stderr": 0.02530590624159063,
"acc_norm": 0.4074074074074074,
"acc_norm_stderr": 0.02530590624159063
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083522,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083522
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8808290155440415,
"acc_stderr": 0.023381935348121437,
"acc_norm": 0.8808290155440415,
"acc_norm_stderr": 0.023381935348121437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6846153846153846,
"acc_stderr": 0.02355964698318994,
"acc_norm": 0.6846153846153846,
"acc_norm_stderr": 0.02355964698318994
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640763,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640763
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8055555555555556,
"acc_stderr": 0.038260763248848646,
"acc_norm": 0.8055555555555556,
"acc_norm_stderr": 0.038260763248848646
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.03226219377286775,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.03226219377286775
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.020930193185179323,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.020930193185179323
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546835,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546835
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3843575418994413,
"acc_stderr": 0.0162690886639594,
"acc_norm": 0.3843575418994413,
"acc_norm_stderr": 0.0162690886639594
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.023993501709042103,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.023993501709042103
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46088657105606257,
"acc_stderr": 0.012731102790504514,
"acc_norm": 0.46088657105606257,
"acc_norm_stderr": 0.012731102790504514
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6911764705882353,
"acc_stderr": 0.02806499816704009,
"acc_norm": 0.6911764705882353,
"acc_norm_stderr": 0.02806499816704009
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8855721393034826,
"acc_stderr": 0.022509345325101706,
"acc_norm": 0.8855721393034826,
"acc_norm_stderr": 0.022509345325101706
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.5351031185202262,
"mc2_stderr": 0.014961733868018287
},
"harness|winogrande|5": {
"acc": 0.8058405682715075,
"acc_stderr": 0.011116983392392657
},
"harness|gsm8k|5": {
"acc": 0.6141015921152388,
"acc_stderr": 0.013409077471319168
}
}
```
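The aggregated metrics above are plain nested dicts, so per-category averages can be recomputed directly once the results are loaded. The snippet below is a minimal sketch over a hand-copied two-task excerpt of the dict (the full dict contains one `harness|hendrycksTest-<subject>|5` entry per MMLU subtask):

```python
# Hand-copied excerpt of the results dict shown above; the real dict holds
# one "harness|hendrycksTest-<subject>|5" entry per MMLU subtask.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.28},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6148148148148148},
    "harness|winogrande|5": {"acc": 0.8058405682715075},
}

# Average accuracy over the MMLU (hendrycksTest) subtasks only,
# ignoring non-MMLU entries such as winogrande.
mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_avg, 4))  # → 0.4474 for this two-task excerpt
```

The same key-prefix filter works on the full dict loaded from the `results` configuration.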
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of ResplendentAI/Sinerva_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ResplendentAI/Sinerva_7B](https://huggingface.co/ResplendentAI/Sinerva_7B) on\
\ the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ResplendentAI__Sinerva_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T23:10:11.458831](https://huggingface.co/datasets/open-llm-leaderboard/details_ResplendentAI__Sinerva_7B/blob/main/results_2024-03-09T23-10-11.458831.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6230401512649942,\n\
\ \"acc_stderr\": 0.032815747491637744,\n \"acc_norm\": 0.6231711257052724,\n\
\ \"acc_norm_stderr\": 0.03349519944326454,\n \"mc1\": 0.4467564259485924,\n\
\ \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.5993028471624264,\n\
\ \"mc2_stderr\": 0.015586350003256836\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6697952218430034,\n \"acc_stderr\": 0.013743085603760424,\n\
\ \"acc_norm\": 0.7013651877133106,\n \"acc_norm_stderr\": 0.013374078615068738\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6770563632742481,\n\
\ \"acc_stderr\": 0.004666457279979416,\n \"acc_norm\": 0.8559051981676957,\n\
\ \"acc_norm_stderr\": 0.0035046810917039027\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909282,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909282\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5555555555555556,\n\
\ \"acc_stderr\": 0.04292596718256981,\n \"acc_norm\": 0.5555555555555556,\n\
\ \"acc_norm_stderr\": 0.04292596718256981\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383888,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383888\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.03257901482099835,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.03257901482099835\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.041443118108781526,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.041443118108781526\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3915343915343915,\n \"acc_stderr\": 0.02513809138885111,\n \"\
acc_norm\": 0.3915343915343915,\n \"acc_norm_stderr\": 0.02513809138885111\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.049236596391733084,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.049236596391733084\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239966,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239966\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5221674876847291,\n \"acc_stderr\": 0.03514528562175007,\n\
\ \"acc_norm\": 0.5221674876847291,\n \"acc_norm_stderr\": 0.03514528562175007\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.030088629490217487,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.030088629490217487\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605246,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605246\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3111111111111111,\n \"acc_stderr\": 0.028226446749683505,\n \
\ \"acc_norm\": 0.3111111111111111,\n \"acc_norm_stderr\": 0.028226446749683505\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887044,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887044\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.03861557546255169,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.03861557546255169\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.016197807956848043,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.016197807956848043\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7510548523206751,\n \"acc_stderr\": 0.028146970599422644,\n \
\ \"acc_norm\": 0.7510548523206751,\n \"acc_norm_stderr\": 0.028146970599422644\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7272727272727273,\n \"acc_stderr\": 0.04065578140908705,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.04065578140908705\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.041858325989283136,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.041858325989283136\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n\
\ \"acc_stderr\": 0.014385525076611573,\n \"acc_norm\": 0.7969348659003831,\n\
\ \"acc_norm_stderr\": 0.014385525076611573\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.48044692737430167,\n\
\ \"acc_stderr\": 0.016709709877661995,\n \"acc_norm\": 0.48044692737430167,\n\
\ \"acc_norm_stderr\": 0.016709709877661995\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937624,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937624\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.02540719779889016,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.02540719779889016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.44680851063829785,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44328552803129073,\n\
\ \"acc_stderr\": 0.01268781841959992,\n \"acc_norm\": 0.44328552803129073,\n\
\ \"acc_norm_stderr\": 0.01268781841959992\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6094771241830066,\n \"acc_stderr\": 0.019737008998094597,\n \
\ \"acc_norm\": 0.6094771241830066,\n \"acc_norm_stderr\": 0.019737008998094597\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.689795918367347,\n \"acc_stderr\": 0.029613459872484378,\n\
\ \"acc_norm\": 0.689795918367347,\n \"acc_norm_stderr\": 0.029613459872484378\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421606,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421606\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5060240963855421,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.5060240963855421,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4467564259485924,\n\
\ \"mc1_stderr\": 0.017403977522557144,\n \"mc2\": 0.5993028471624264,\n\
\ \"mc2_stderr\": 0.015586350003256836\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498412\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.623199393479909,\n \
\ \"acc_stderr\": 0.013347858757829154\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-10-11.458831.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-10-11.458831.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- '**/details_harness|winogrande|5_2024-03-09T23-10-11.458831.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T23-10-11.458831.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_10_11.458831
path:
- results_2024-03-09T23-10-11.458831.parquet
- split: latest
path:
- results_2024-03-09T23-10-11.458831.parquet
---
# Dataset Card for Evaluation run of ResplendentAI/Sinerva_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ResplendentAI/Sinerva_7B](https://huggingface.co/ResplendentAI/Sinerva_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ResplendentAI__Sinerva_7B",
"harness_winogrande_5",
split="train")
```
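
The per-run splits follow a simple naming convention visible in this card's `data_files` listing: the run timestamp with `-` and `:` replaced by `_` (e.g. `2024_03_09T23_10_11.458831`), with `latest` as an alias for the most recent run. A minimal sketch of that mapping (the helper name is hypothetical; the convention is inferred from the listings above):

```python
# Map a run timestamp to its split name, as used in this card's data_files
# (hypothetical helper; the convention is inferred from the listings above).
def timestamp_to_split(ts: str) -> str:
    return ts.replace("-", "_").replace(":", "_")

split_name = timestamp_to_split("2024-03-09T23:10:11.458831")
# -> "2024_03_09T23_10_11.458831"
# Pass split_name (or the "latest" alias) as the split= argument to
# load_dataset() as shown above.
```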
## Latest results
These are the [latest results from run 2024-03-09T23:10:11.458831](https://huggingface.co/datasets/open-llm-leaderboard/details_ResplendentAI__Sinerva_7B/blob/main/results_2024-03-09T23-10-11.458831.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6230401512649942,
"acc_stderr": 0.032815747491637744,
"acc_norm": 0.6231711257052724,
"acc_norm_stderr": 0.03349519944326454,
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.5993028471624264,
"mc2_stderr": 0.015586350003256836
},
"harness|arc:challenge|25": {
"acc": 0.6697952218430034,
"acc_stderr": 0.013743085603760424,
"acc_norm": 0.7013651877133106,
"acc_norm_stderr": 0.013374078615068738
},
"harness|hellaswag|10": {
"acc": 0.6770563632742481,
"acc_stderr": 0.004666457279979416,
"acc_norm": 0.8559051981676957,
"acc_norm_stderr": 0.0035046810917039027
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5555555555555556,
"acc_stderr": 0.04292596718256981,
"acc_norm": 0.5555555555555556,
"acc_norm_stderr": 0.04292596718256981
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383888,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383888
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.03257901482099835,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.03257901482099835
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.041443118108781526,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.041443118108781526
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3915343915343915,
"acc_stderr": 0.02513809138885111,
"acc_norm": 0.3915343915343915,
"acc_norm_stderr": 0.02513809138885111
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.4,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239966,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239966
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5221674876847291,
"acc_stderr": 0.03514528562175007,
"acc_norm": 0.5221674876847291,
"acc_norm_stderr": 0.03514528562175007
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.030088629490217487,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.030088629490217487
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605246,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605246
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3111111111111111,
"acc_stderr": 0.028226446749683505,
"acc_norm": 0.3111111111111111,
"acc_norm_stderr": 0.028226446749683505
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887044,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887044
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.03861557546255169,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.03861557546255169
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.016197807956848043,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.016197807956848043
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7510548523206751,
"acc_stderr": 0.028146970599422644,
"acc_norm": 0.7510548523206751,
"acc_norm_stderr": 0.028146970599422644
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.04065578140908705,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.04065578140908705
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.041858325989283136,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.041858325989283136
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.014385525076611573,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.014385525076611573
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.48044692737430167,
"acc_stderr": 0.016709709877661995,
"acc_norm": 0.48044692737430167,
"acc_norm_stderr": 0.016709709877661995
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937624,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937624
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.02540719779889016,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.02540719779889016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44328552803129073,
"acc_stderr": 0.01268781841959992,
"acc_norm": 0.44328552803129073,
"acc_norm_stderr": 0.01268781841959992
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6094771241830066,
"acc_stderr": 0.019737008998094597,
"acc_norm": 0.6094771241830066,
"acc_norm_stderr": 0.019737008998094597
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.689795918367347,
"acc_stderr": 0.029613459872484378,
"acc_norm": 0.689795918367347,
"acc_norm_stderr": 0.029613459872484378
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421606,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421606
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.82,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5060240963855421,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.5060240963855421,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4467564259485924,
"mc1_stderr": 0.017403977522557144,
"mc2": 0.5993028471624264,
"mc2_stderr": 0.015586350003256836
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498412
},
"harness|gsm8k|5": {
"acc": 0.623199393479909,
"acc_stderr": 0.013347858757829154
}
}
```
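
The per-task entries above can be post-processed directly. For instance, a macro-average over the MMLU (`hendrycksTest`) subtasks can be sketched as follows (the helper is illustrative, operating on a dict shaped like the JSON above; the excerpt values are taken from this card):

```python
# Macro-average the 5-shot MMLU accuracies from a results dict shaped like
# the JSON above (illustrative helper; excerpt values taken from this card).
def mmlu_average(results: dict) -> float:
    accs = [v["acc"] for key, v in results.items()
            if key.startswith("harness|hendrycksTest-")]
    return sum(accs) / len(accs)

excerpt = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.24},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5555555555555556},
    "harness|winogrande|5": {"acc": 0.8255722178374112},  # ignored: not MMLU
}
avg = mmlu_average(excerpt)  # averages only the hendrycksTest entries
```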
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_Kabster__Bio-Mistralv2-Squared | open-llm-leaderboard-old | "2024-03-10T05:17:37Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T23:13:36Z" | ---
pretty_name: Evaluation run of Kabster/Bio-Mistralv2-Squared
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kabster/Bio-Mistralv2-Squared](https://huggingface.co/Kabster/Bio-Mistralv2-Squared)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T05:14:57.667370](https://huggingface.co/datasets/open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared/blob/main/results_2024-03-10T05-14-57.667370.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5947088232667346,\n\
\ \"acc_stderr\": 0.033001873861023094,\n \"acc_norm\": 0.605373945602307,\n\
\ \"acc_norm_stderr\": 0.033881891438851675,\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.6098919620417469,\n\
\ \"mc2_stderr\": 0.015392589865179624\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5938566552901023,\n \"acc_stderr\": 0.014351656690097862,\n\
\ \"acc_norm\": 0.6296928327645052,\n \"acc_norm_stderr\": 0.01411129875167495\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.657239593706433,\n\
\ \"acc_stderr\": 0.004736621698861176,\n \"acc_norm\": 0.8401712806213901,\n\
\ \"acc_norm_stderr\": 0.0036569821653861666\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395268,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395268\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7132075471698113,\n \"acc_stderr\": 0.027834912527544057,\n\
\ \"acc_norm\": 0.7132075471698113,\n \"acc_norm_stderr\": 0.027834912527544057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.036430371689585475,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.036430371689585475\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \"acc_norm\": 0.71,\n\
\ \"acc_norm_stderr\": 0.04560480215720684\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4978723404255319,\n \"acc_stderr\": 0.03268572658667492,\n\
\ \"acc_norm\": 0.4978723404255319,\n \"acc_norm_stderr\": 0.03268572658667492\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.0250437573185202,\n \"acc_norm\"\
: 0.3835978835978836,\n \"acc_norm_stderr\": 0.0250437573185202\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7129032258064516,\n\
\ \"acc_stderr\": 0.025736542745594528,\n \"acc_norm\": 0.7129032258064516,\n\
\ \"acc_norm_stderr\": 0.025736542745594528\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494562,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494562\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8341968911917098,\n \"acc_stderr\": 0.026839845022314415,\n\
\ \"acc_norm\": 0.8341968911917098,\n \"acc_norm_stderr\": 0.026839845022314415\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5794871794871795,\n \"acc_stderr\": 0.025028610276710862,\n\
\ \"acc_norm\": 0.5794871794871795,\n \"acc_norm_stderr\": 0.025028610276710862\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.034099716973523674,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.034099716973523674\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n\
\ \"acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.70042194092827,\n \"acc_stderr\": 0.02981802474975309,\n \
\ \"acc_norm\": 0.70042194092827,\n \"acc_norm_stderr\": 0.02981802474975309\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419995,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.036412970813137296,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.036412970813137296\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6759259259259259,\n\
\ \"acc_stderr\": 0.045245960070300476,\n \"acc_norm\": 0.6759259259259259,\n\
\ \"acc_norm_stderr\": 0.045245960070300476\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7055214723926381,\n \"acc_stderr\": 0.03581165790474082,\n\
\ \"acc_norm\": 0.7055214723926381,\n \"acc_norm_stderr\": 0.03581165790474082\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8504273504273504,\n\
\ \"acc_stderr\": 0.023365051491753715,\n \"acc_norm\": 0.8504273504273504,\n\
\ \"acc_norm_stderr\": 0.023365051491753715\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7739463601532567,\n\
\ \"acc_stderr\": 0.014957458504335833,\n \"acc_norm\": 0.7739463601532567,\n\
\ \"acc_norm_stderr\": 0.014957458504335833\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6647398843930635,\n \"acc_stderr\": 0.02541600377316554,\n\
\ \"acc_norm\": 0.6647398843930635,\n \"acc_norm_stderr\": 0.02541600377316554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.015060381730018103,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.015060381730018103\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.0264930332251459,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.0264930332251459\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6655948553054662,\n\
\ \"acc_stderr\": 0.026795422327893934,\n \"acc_norm\": 0.6655948553054662,\n\
\ \"acc_norm_stderr\": 0.026795422327893934\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.026725868809100793,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.026725868809100793\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.40425531914893614,\n \"acc_stderr\": 0.029275532159704725,\n \
\ \"acc_norm\": 0.40425531914893614,\n \"acc_norm_stderr\": 0.029275532159704725\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42894393741851367,\n\
\ \"acc_stderr\": 0.012640625443067361,\n \"acc_norm\": 0.42894393741851367,\n\
\ \"acc_norm_stderr\": 0.012640625443067361\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5955882352941176,\n \"acc_stderr\": 0.02981263070156974,\n\
\ \"acc_norm\": 0.5955882352941176,\n \"acc_norm_stderr\": 0.02981263070156974\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5996732026143791,\n \"acc_stderr\": 0.019821843688271758,\n \
\ \"acc_norm\": 0.5996732026143791,\n \"acc_norm_stderr\": 0.019821843688271758\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.02927956741106568,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.02927956741106568\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.029705284056772432,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.029705284056772432\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4879518072289157,\n\
\ \"acc_stderr\": 0.03891364495835821,\n \"acc_norm\": 0.4879518072289157,\n\
\ \"acc_norm_stderr\": 0.03891364495835821\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.42962056303549573,\n\
\ \"mc1_stderr\": 0.0173292345804091,\n \"mc2\": 0.6098919620417469,\n\
\ \"mc2_stderr\": 0.015392589865179624\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7774269928966061,\n \"acc_stderr\": 0.011690933809712666\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.001516300227445034,\n \
\ \"acc_stderr\": 0.0010717793485492668\n }\n}\n```"
repo_url: https://huggingface.co/Kabster/Bio-Mistralv2-Squared
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|arc:challenge|25_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|gsm8k|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hellaswag|10_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-11-18.304357.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-14-57.667370.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T05-14-57.667370.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- '**/details_harness|winogrande|5_2024-03-09T23-11-18.304357.parquet'
- split: 2024_03_10T05_14_57.667370
path:
- '**/details_harness|winogrande|5_2024-03-10T05-14-57.667370.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T05-14-57.667370.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_11_18.304357
path:
- results_2024-03-09T23-11-18.304357.parquet
- split: 2024_03_10T05_14_57.667370
path:
- results_2024-03-10T05-14-57.667370.parquet
- split: latest
path:
- results_2024-03-10T05-14-57.667370.parquet
---
# Dataset Card for Evaluation run of Kabster/Bio-Mistralv2-Squared
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kabster/Bio-Mistralv2-Squared](https://huggingface.co/Kabster/Bio-Mistralv2-Squared) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared",
"harness_winogrande_5",
split="train")
```
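The config and split names follow a regular pattern in the YAML listings above: per-task MMLU configs look like `harness_hendrycksTest_<task>_5`, and each run's split name appears to be its timestamp with hyphens replaced by underscores. The helpers below are an illustrative sketch derived from those listings (the naming convention is an assumption based on this card, not a documented API):

```python
# Sketch of the naming conventions observed in this card's config listings.
# Both helpers are assumptions inferred from the YAML above.

def split_name_for_run(run_timestamp: str) -> str:
    """Map a run timestamp like '2024-03-10T05-14-57.667370'
    to its split name '2024_03_10T05_14_57.667370'."""
    return run_timestamp.replace("-", "_")

def mmlu_config_name(task: str, n_shot: int = 5) -> str:
    """Build the config name for a hendrycksTest (MMLU) task."""
    return f"harness_hendrycksTest_{task}_{n_shot}"

# Usage with the datasets library (requires network access, so left commented):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared",
#     mmlu_config_name("high_school_biology"),
#     split=split_name_for_run("2024-03-10T05-14-57.667370"),
# )

print(mmlu_config_name("high_school_biology"))
print(split_name_for_run("2024-03-10T05-14-57.667370"))
```

Passing `split="latest"` instead of a timestamped split selects the most recent run, per the `latest` entries in the config listings above.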
## Latest results
These are the [latest results from run 2024-03-10T05:14:57.667370](https://huggingface.co/datasets/open-llm-leaderboard/details_Kabster__Bio-Mistralv2-Squared/blob/main/results_2024-03-10T05-14-57.667370.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5947088232667346,
"acc_stderr": 0.033001873861023094,
"acc_norm": 0.605373945602307,
"acc_norm_stderr": 0.033881891438851675,
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.6098919620417469,
"mc2_stderr": 0.015392589865179624
},
"harness|arc:challenge|25": {
"acc": 0.5938566552901023,
"acc_stderr": 0.014351656690097862,
"acc_norm": 0.6296928327645052,
"acc_norm_stderr": 0.01411129875167495
},
"harness|hellaswag|10": {
"acc": 0.657239593706433,
"acc_stderr": 0.004736621698861176,
"acc_norm": 0.8401712806213901,
"acc_norm_stderr": 0.0036569821653861666
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395268,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395268
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7132075471698113,
"acc_stderr": 0.027834912527544057,
"acc_norm": 0.7132075471698113,
"acc_norm_stderr": 0.027834912527544057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4978723404255319,
"acc_stderr": 0.03268572658667492,
"acc_norm": 0.4978723404255319,
"acc_norm_stderr": 0.03268572658667492
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.0250437573185202,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.0250437573185202
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7129032258064516,
"acc_stderr": 0.025736542745594528,
"acc_norm": 0.7129032258064516,
"acc_norm_stderr": 0.025736542745594528
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494562,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494562
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8341968911917098,
"acc_stderr": 0.026839845022314415,
"acc_norm": 0.8341968911917098,
"acc_norm_stderr": 0.026839845022314415
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5794871794871795,
"acc_stderr": 0.025028610276710862,
"acc_norm": 0.5794871794871795,
"acc_norm_stderr": 0.025028610276710862
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5,
"acc_stderr": 0.034099716973523674,
"acc_norm": 0.5,
"acc_norm_stderr": 0.034099716973523674
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.70042194092827,
"acc_stderr": 0.02981802474975309,
"acc_norm": 0.70042194092827,
"acc_norm_stderr": 0.02981802474975309
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419995,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.036412970813137296,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.036412970813137296
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6759259259259259,
"acc_stderr": 0.045245960070300476,
"acc_norm": 0.6759259259259259,
"acc_norm_stderr": 0.045245960070300476
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7055214723926381,
"acc_stderr": 0.03581165790474082,
"acc_norm": 0.7055214723926381,
"acc_norm_stderr": 0.03581165790474082
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8504273504273504,
"acc_stderr": 0.023365051491753715,
"acc_norm": 0.8504273504273504,
"acc_norm_stderr": 0.023365051491753715
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7739463601532567,
"acc_stderr": 0.014957458504335833,
"acc_norm": 0.7739463601532567,
"acc_norm_stderr": 0.014957458504335833
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.02541600377316554,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.02541600377316554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.015060381730018103,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.015060381730018103
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.0264930332251459,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.0264930332251459
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6655948553054662,
"acc_stderr": 0.026795422327893934,
"acc_norm": 0.6655948553054662,
"acc_norm_stderr": 0.026795422327893934
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.026725868809100793,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.026725868809100793
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.40425531914893614,
"acc_stderr": 0.029275532159704725,
"acc_norm": 0.40425531914893614,
"acc_norm_stderr": 0.029275532159704725
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42894393741851367,
"acc_stderr": 0.012640625443067361,
"acc_norm": 0.42894393741851367,
"acc_norm_stderr": 0.012640625443067361
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5955882352941176,
"acc_stderr": 0.02981263070156974,
"acc_norm": 0.5955882352941176,
"acc_norm_stderr": 0.02981263070156974
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5996732026143791,
"acc_stderr": 0.019821843688271758,
"acc_norm": 0.5996732026143791,
"acc_norm_stderr": 0.019821843688271758
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.02927956741106568,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.02927956741106568
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.029705284056772432,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.029705284056772432
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4879518072289157,
"acc_stderr": 0.03891364495835821,
"acc_norm": 0.4879518072289157,
"acc_norm_stderr": 0.03891364495835821
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.42962056303549573,
"mc1_stderr": 0.0173292345804091,
"mc2": 0.6098919620417469,
"mc2_stderr": 0.015392589865179624
},
"harness|winogrande|5": {
"acc": 0.7774269928966061,
"acc_stderr": 0.011690933809712666
},
"harness|gsm8k|5": {
"acc": 0.001516300227445034,
"acc_stderr": 0.0010717793485492668
}
}
```
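Since the results above are a plain nested JSON object (one entry per task, each with `acc`/`acc_norm`-style metrics), it is straightforward to scan them programmatically, for example to flag the weakest tasks. The sketch below uses a small excerpt of the dictionary above; the 0.4 threshold is an arbitrary choice for illustration:

```python
# Excerpt of the per-task metrics shown above (values copied from this card).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33, "acc_norm": 0.33},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8504273504273504,
                                          "acc_norm": 0.8504273504273504},
    "harness|gsm8k|5": {"acc": 0.001516300227445034},
}

# Prefer acc_norm when present, fall back to acc (gsm8k has no acc_norm).
weak_tasks = sorted(
    name
    for name, metrics in results.items()
    if metrics.get("acc_norm", metrics.get("acc", 0.0)) < 0.4
)

print(weak_tasks)
```

The same loop works on the full dictionary once it is loaded from the `results_*.json` file linked above.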
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_Kabster__BioMistral-Zephyr-Beta-SLERP | open-llm-leaderboard-old | "2024-03-09T23:19:56Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-09T23:19:31Z" | ---
pretty_name: Evaluation run of Kabster/BioMistral-Zephyr-Beta-SLERP
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Kabster/BioMistral-Zephyr-Beta-SLERP](https://huggingface.co/Kabster/BioMistral-Zephyr-Beta-SLERP)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T23:17:12.005512](https://huggingface.co/datasets/open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP/blob/main/results_2024-03-09T23-17-12.005512.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5995043159633443,\n\
\ \"acc_stderr\": 0.033015404417283706,\n \"acc_norm\": 0.6105459399539238,\n\
\ \"acc_norm_stderr\": 0.03391082208761106,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190792,\n \"mc2\": 0.5460488636416867,\n\
\ \"mc2_stderr\": 0.015366957850368226\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5844709897610921,\n \"acc_stderr\": 0.014401366641216386,\n\
\ \"acc_norm\": 0.621160409556314,\n \"acc_norm_stderr\": 0.014175915490000326\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6591316470822546,\n\
\ \"acc_stderr\": 0.004730324556624127,\n \"acc_norm\": 0.8412666799442342,\n\
\ \"acc_norm_stderr\": 0.003646803899770339\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.631578947368421,\n \"acc_stderr\": 0.03925523381052932,\n\
\ \"acc_norm\": 0.631578947368421,\n \"acc_norm_stderr\": 0.03925523381052932\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443865,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443865\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.0372424959581773,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.0372424959581773\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5106382978723404,\n \"acc_stderr\": 0.03267862331014063,\n\
\ \"acc_norm\": 0.5106382978723404,\n \"acc_norm_stderr\": 0.03267862331014063\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.04685473041907789,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.04685473041907789\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.37037037037037035,\n \"acc_stderr\": 0.024870815251057093,\n \"\
acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.024870815251057093\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7161290322580646,\n\
\ \"acc_stderr\": 0.02564938106302925,\n \"acc_norm\": 0.7161290322580646,\n\
\ \"acc_norm_stderr\": 0.02564938106302925\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7090909090909091,\n \"acc_stderr\": 0.03546563019624336,\n\
\ \"acc_norm\": 0.7090909090909091,\n \"acc_norm_stderr\": 0.03546563019624336\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6333333333333333,\n \"acc_stderr\": 0.02443301646605247,\n \
\ \"acc_norm\": 0.6333333333333333,\n \"acc_norm_stderr\": 0.02443301646605247\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.030066761582977927,\n\
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.030066761582977927\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7743119266055046,\n \"acc_stderr\": 0.017923087667803064,\n \"\
acc_norm\": 0.7743119266055046,\n \"acc_norm_stderr\": 0.017923087667803064\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703643,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703643\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7177914110429447,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.7177914110429447,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4107142857142857,\n\
\ \"acc_stderr\": 0.046695106638751906,\n \"acc_norm\": 0.4107142857142857,\n\
\ \"acc_norm_stderr\": 0.046695106638751906\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7378640776699029,\n \"acc_stderr\": 0.043546310772605956,\n\
\ \"acc_norm\": 0.7378640776699029,\n \"acc_norm_stderr\": 0.043546310772605956\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8333333333333334,\n\
\ \"acc_stderr\": 0.024414947304543674,\n \"acc_norm\": 0.8333333333333334,\n\
\ \"acc_norm_stderr\": 0.024414947304543674\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7726692209450831,\n\
\ \"acc_stderr\": 0.014987270640946012,\n \"acc_norm\": 0.7726692209450831,\n\
\ \"acc_norm_stderr\": 0.014987270640946012\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6907514450867052,\n \"acc_stderr\": 0.02488314057007176,\n\
\ \"acc_norm\": 0.6907514450867052,\n \"acc_norm_stderr\": 0.02488314057007176\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.28268156424581004,\n\
\ \"acc_stderr\": 0.0150603817300181,\n \"acc_norm\": 0.28268156424581004,\n\
\ \"acc_norm_stderr\": 0.0150603817300181\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6993464052287581,\n \"acc_stderr\": 0.02625605383571896,\n\
\ \"acc_norm\": 0.6993464052287581,\n \"acc_norm_stderr\": 0.02625605383571896\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6820987654320988,\n \"acc_stderr\": 0.02591006352824088,\n\
\ \"acc_norm\": 0.6820987654320988,\n \"acc_norm_stderr\": 0.02591006352824088\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558562,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558562\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.438722294654498,\n\
\ \"acc_stderr\": 0.012673969883493272,\n \"acc_norm\": 0.438722294654498,\n\
\ \"acc_norm_stderr\": 0.012673969883493272\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6160130718954249,\n \"acc_stderr\": 0.019675808135281504,\n \
\ \"acc_norm\": 0.6160130718954249,\n \"acc_norm_stderr\": 0.019675808135281504\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263734,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263734\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7543859649122807,\n \"acc_stderr\": 0.0330140594698725,\n\
\ \"acc_norm\": 0.7543859649122807,\n \"acc_norm_stderr\": 0.0330140594698725\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.01699762787190792,\n \"mc2\": 0.5460488636416867,\n\
\ \"mc2_stderr\": 0.015366957850368226\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7663772691397001,\n \"acc_stderr\": 0.011892194477183524\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Kabster/BioMistral-Zephyr-Beta-SLERP
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-17-12.005512.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-17-12.005512.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- '**/details_harness|winogrande|5_2024-03-09T23-17-12.005512.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T23-17-12.005512.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_17_12.005512
path:
- results_2024-03-09T23-17-12.005512.parquet
- split: latest
path:
- results_2024-03-09T23-17-12.005512.parquet
---
# Dataset Card for Evaluation run of Kabster/BioMistral-Zephyr-Beta-SLERP
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Kabster/BioMistral-Zephyr-Beta-SLERP](https://huggingface.co/Kabster/BioMistral-Zephyr-Beta-SLERP) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP",
"harness_winogrande_5",
split="train")
```
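As the YAML configs above show, each timestamped split name uses underscores while the corresponding parquet filenames use hyphens. A small illustrative helper (not part of the dataset tooling; the function name is our own) makes the mapping explicit:

```python
def split_to_file_timestamp(split_name: str) -> str:
    """Map a timestamped split name (e.g. "2024_03_09T23_17_12.005512")
    to the timestamp used in the parquet filenames
    (e.g. "2024-03-09T23-17-12.005512")."""
    # The split name simply replaces the hyphens and colons of the
    # run timestamp with underscores, so the reverse mapping is direct.
    return split_name.replace("_", "-")

# e.g. split "2024_03_09T23_17_12.005512"
# -> files "*_2024-03-09T23-17-12.005512.parquet"
```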
## Latest results
These are the [latest results from run 2024-03-09T23:17:12.005512](https://huggingface.co/datasets/open-llm-leaderboard/details_Kabster__BioMistral-Zephyr-Beta-SLERP/blob/main/results_2024-03-09T23-17-12.005512.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the "results" and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5995043159633443,
"acc_stderr": 0.033015404417283706,
"acc_norm": 0.6105459399539238,
"acc_norm_stderr": 0.03391082208761106,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190792,
"mc2": 0.5460488636416867,
"mc2_stderr": 0.015366957850368226
},
"harness|arc:challenge|25": {
"acc": 0.5844709897610921,
"acc_stderr": 0.014401366641216386,
"acc_norm": 0.621160409556314,
"acc_norm_stderr": 0.014175915490000326
},
"harness|hellaswag|10": {
"acc": 0.6591316470822546,
"acc_stderr": 0.004730324556624127,
"acc_norm": 0.8412666799442342,
"acc_norm_stderr": 0.003646803899770339
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.631578947368421,
"acc_stderr": 0.03925523381052932,
"acc_norm": 0.631578947368421,
"acc_norm_stderr": 0.03925523381052932
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443865,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443865
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.0372424959581773,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.0372424959581773
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5106382978723404,
"acc_stderr": 0.03267862331014063,
"acc_norm": 0.5106382978723404,
"acc_norm_stderr": 0.03267862331014063
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.04685473041907789,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.04685473041907789
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.024870815251057093,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.024870815251057093
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7161290322580646,
"acc_stderr": 0.02564938106302925,
"acc_norm": 0.7161290322580646,
"acc_norm_stderr": 0.02564938106302925
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.03546563019624336,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.03546563019624336
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6333333333333333,
"acc_stderr": 0.02443301646605247,
"acc_norm": 0.6333333333333333,
"acc_norm_stderr": 0.02443301646605247
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.030066761582977927,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.030066761582977927
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7743119266055046,
"acc_stderr": 0.017923087667803064,
"acc_norm": 0.7743119266055046,
"acc_norm_stderr": 0.017923087667803064
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703643,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703643
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7177914110429447,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.7177914110429447,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4107142857142857,
"acc_stderr": 0.046695106638751906,
"acc_norm": 0.4107142857142857,
"acc_norm_stderr": 0.046695106638751906
},
"harness|hendrycksTest-management|5": {
"acc": 0.7378640776699029,
"acc_stderr": 0.043546310772605956,
"acc_norm": 0.7378640776699029,
"acc_norm_stderr": 0.043546310772605956
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.024414947304543674,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.024414947304543674
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7726692209450831,
"acc_stderr": 0.014987270640946012,
"acc_norm": 0.7726692209450831,
"acc_norm_stderr": 0.014987270640946012
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6907514450867052,
"acc_stderr": 0.02488314057007176,
"acc_norm": 0.6907514450867052,
"acc_norm_stderr": 0.02488314057007176
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.28268156424581004,
"acc_stderr": 0.0150603817300181,
"acc_norm": 0.28268156424581004,
"acc_norm_stderr": 0.0150603817300181
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6993464052287581,
"acc_stderr": 0.02625605383571896,
"acc_norm": 0.6993464052287581,
"acc_norm_stderr": 0.02625605383571896
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6820987654320988,
"acc_stderr": 0.02591006352824088,
"acc_norm": 0.6820987654320988,
"acc_norm_stderr": 0.02591006352824088
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558562,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558562
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.438722294654498,
"acc_stderr": 0.012673969883493272,
"acc_norm": 0.438722294654498,
"acc_norm_stderr": 0.012673969883493272
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6160130718954249,
"acc_stderr": 0.019675808135281504,
"acc_norm": 0.6160130718954249,
"acc_norm_stderr": 0.019675808135281504
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263734,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263734
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7543859649122807,
"acc_stderr": 0.0330140594698725,
"acc_norm": 0.7543859649122807,
"acc_norm_stderr": 0.0330140594698725
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.01699762787190792,
"mc2": 0.5460488636416867,
"mc2_stderr": 0.015366957850368226
},
"harness|winogrande|5": {
"acc": 0.7663772691397001,
"acc_stderr": 0.011892194477183524
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
AcaSp/DomainSpeech | AcaSp | "2024-03-22T08:37:15Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"modality:audio",
"modality:text",
"region:us"
] | null | "2024-03-09T23:23:46Z" | ---
dataset_info:
- config_name: Agriculture_Agricultural Biotechnology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143439038.0
num_examples: 300
download_size: 143297680
dataset_size: 143439038.0
- config_name: Agriculture_Agricultural Economics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138126833.0
num_examples: 300
download_size: 138014919
dataset_size: 138126833.0
- config_name: Agriculture_Agricultural Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143180625.0
num_examples: 300
download_size: 143050446
dataset_size: 143180625.0
- config_name: Agriculture_Agricultural Mechanization
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 154916533.0
num_examples: 300
download_size: 154747365
dataset_size: 154916533.0
- config_name: Agriculture_Animal Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 146354369.0
num_examples: 300
download_size: 146220983
dataset_size: 146354369.0
- config_name: Agriculture_Crop Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143046061.0
num_examples: 300
download_size: 142880656
dataset_size: 143046061.0
- config_name: Agriculture_Entomology and Pesticides
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143552360.0
num_examples: 300
download_size: 143407167
dataset_size: 143552360.0
- config_name: Agriculture_Fisheries
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138944065.0
num_examples: 300
download_size: 138788871
dataset_size: 138944065.0
- config_name: Agriculture_Forestry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 140535848.0
num_examples: 300
download_size: 140392528
dataset_size: 140535848.0
- config_name: Agriculture_Horticulture
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147926282.0
num_examples: 300
download_size: 147791744
dataset_size: 147926282.0
- config_name: Agriculture_Plant Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123700367.0
num_examples: 300
download_size: 123597900
dataset_size: 123700367.0
- config_name: Agriculture_Poultry Production
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147073759.0
num_examples: 300
download_size: 146906099
dataset_size: 147073759.0
- config_name: Agriculture_Soil Sciences and Plant Nutrition
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127354046.0
num_examples: 300
download_size: 127256326
dataset_size: 127354046.0
- config_name: Agriculture_Soil and Water Engineering and Conservation
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134537041.0
num_examples: 300
download_size: 134387592
dataset_size: 134537041.0
- config_name: Arts Design_Arts
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 119548638.0
num_examples: 300
download_size: 119440736
dataset_size: 119548638.0
- config_name: Arts Design_Design
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135083325.0
num_examples: 300
download_size: 134936083
dataset_size: 135083325.0
- config_name: Arts Design_Interior Architecture
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141126586.0
num_examples: 300
download_size: 140979090
dataset_size: 141126586.0
- config_name: Arts Design_Urban Planning
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147980852.0
num_examples: 300
download_size: 147794755
dataset_size: 147980852.0
- config_name: Business_Business Administration
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 121104401.0
num_examples: 300
download_size: 120968900
dataset_size: 121104401.0
- config_name: Business_Communications and Media Studies
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123893864.0
num_examples: 300
download_size: 123794867
dataset_size: 123893864.0
- config_name: Business_Decision Science and Operations Management
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 117426723.0
num_examples: 300
download_size: 117317155
dataset_size: 117426723.0
- config_name: Business_Entrepreneurship
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129740439.0
num_examples: 300
download_size: 129590618
dataset_size: 129740439.0
- config_name: Business_Human Resource Management
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134109342.0
num_examples: 300
download_size: 133946610
dataset_size: 134109342.0
- config_name: Business_Marketing
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131082374.0
num_examples: 300
download_size: 130942488
dataset_size: 131082374.0
- config_name: Business_Public Administration
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128436764.0
num_examples: 300
download_size: 128268709
dataset_size: 128436764.0
- config_name: Business_Strategic Management
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129705598.0
num_examples: 300
download_size: 129565676
dataset_size: 129705598.0
- config_name: Economics_Accounting and Finance
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130086798.0
num_examples: 300
download_size: 129970443
dataset_size: 130086798.0
- config_name: Economics_Banking and Insurance
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125576327.0
num_examples: 300
download_size: 125457196
dataset_size: 125576327.0
- config_name: Economics_Environmental Economics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 144396467.0
num_examples: 300
download_size: 144269317
dataset_size: 144396467.0
- config_name: Economics_Financial Economics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126345574.0
num_examples: 300
download_size: 126213407
dataset_size: 126345574.0
- config_name: Economics_International Trade
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129266847.0
num_examples: 300
download_size: 129131077
dataset_size: 129266847.0
- config_name: Education_Early Childhood Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134842546.0
num_examples: 300
download_size: 134669041
dataset_size: 134842546.0
- config_name: Education_Educational Administration
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129139609.0
num_examples: 300
download_size: 129009495
dataset_size: 129139609.0
- config_name: Education_Educational Psychology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132445380.0
num_examples: 300
download_size: 132314227
dataset_size: 132445380.0
- config_name: Education_Educational Technology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136349543.0
num_examples: 300
download_size: 136233919
dataset_size: 136349543.0
- config_name: Education_Elemantary Teacher Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128929721.0
num_examples: 300
download_size: 128832448
dataset_size: 128929721.0
- config_name: Education_Foreign Language Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132729799.0
num_examples: 300
download_size: 132576098
dataset_size: 132729799.0
- config_name: Education_Guidance and Counseling
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137961853.0
num_examples: 300
download_size: 137814518
dataset_size: 137961853.0
- config_name: Education_Mathematics and Science Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134215509.0
num_examples: 300
download_size: 134099723
dataset_size: 134215509.0
- config_name: Education_Physical Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132937777.0
num_examples: 300
download_size: 132805858
dataset_size: 132937777.0
- config_name: Education_Sociology of Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124285485.0
num_examples: 300
download_size: 124176688
dataset_size: 124285485.0
- config_name: Education_Special Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 152289384.0
num_examples: 300
download_size: 152131422
dataset_size: 152289384.0
- config_name: Engineering_Aerospace Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124292138.0
num_examples: 300
download_size: 124191922
dataset_size: 124292138.0
- config_name: Engineering_Automotive Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143846463.0
num_examples: 300
download_size: 143708257
dataset_size: 143846463.0
- config_name: Engineering_Bioengineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143137978.0
num_examples: 300
download_size: 143012457
dataset_size: 143137978.0
- config_name: Engineering_Biomaterials and Tissue Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137146975.0
num_examples: 300
download_size: 137025731
dataset_size: 137146975.0
- config_name: Engineering_Biomedical Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131378195.0
num_examples: 300
download_size: 131261573
dataset_size: 131378195.0
- config_name: Engineering_Chemical Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143133003.0
num_examples: 300
download_size: 143008061
dataset_size: 143133003.0
- config_name: Engineering_Civil Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130465075.0
num_examples: 300
download_size: 130356251
dataset_size: 130465075.0
- config_name: Engineering_Computer Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132679470.0
num_examples: 300
download_size: 132529121
dataset_size: 132679470.0
- config_name: Engineering_Earth Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 119846962.0
num_examples: 300
download_size: 119730185
dataset_size: 119846962.0
- config_name: Engineering_Electrical and Electronic Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126520050.0
num_examples: 300
download_size: 126360752
dataset_size: 126520050.0
- config_name: Engineering_Electrical and Information Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123849397.0
num_examples: 300
download_size: 123716265
dataset_size: 123849397.0
- config_name: Engineering_Energy Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137784439.0
num_examples: 300
download_size: 137683801
dataset_size: 137784439.0
- config_name: Engineering_Environmental Science and Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137198399.0
num_examples: 300
download_size: 137059643
dataset_size: 137198399.0
- config_name: Engineering_Food Science and Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133611502.0
num_examples: 300
download_size: 133484623
dataset_size: 133611502.0
- config_name: Engineering_Geomatics Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129068429.0
num_examples: 300
download_size: 128978145
dataset_size: 129068429.0
- config_name: Engineering_Industrial and Manufacturing Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 122429186.0
num_examples: 300
download_size: 122322658
dataset_size: 122429186.0
- config_name: Engineering_Marine Sciences and Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132973282.0
num_examples: 300
download_size: 132860408
dataset_size: 132973282.0
- config_name: Engineering_Mechanical Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135364923.0
num_examples: 300
download_size: 135221594
dataset_size: 135364923.0
- config_name: Engineering_Mechatronics Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126449973.0
num_examples: 300
download_size: 126341559
dataset_size: 126449973.0
- config_name: Engineering_Metallurgical and Materials Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124292613.0
num_examples: 300
download_size: 124165732
dataset_size: 124292613.0
- config_name: Engineering_Meteorology and Atmospheric Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 120671090.0
num_examples: 300
download_size: 120549799
dataset_size: 120671090.0
- config_name: Engineering_Mining Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133000100.0
num_examples: 300
download_size: 132898319
dataset_size: 133000100.0
- config_name: Engineering_Nanoscience and Nanotechnology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126720028.0
num_examples: 300
download_size: 126601451
dataset_size: 126720028.0
- config_name: Engineering_Nuclear Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138378246.0
num_examples: 300
download_size: 138263608
dataset_size: 138378246.0
- config_name: Engineering_Petroleum Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131247557.0
num_examples: 300
download_size: 131121220
dataset_size: 131247557.0
- config_name: Engineering_Textile Engineering
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138330600.0
num_examples: 300
download_size: 138157500
dataset_size: 138330600.0
- config_name: History_History
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130253621.0
num_examples: 300
download_size: 130146337
dataset_size: 130253621.0
- config_name: Law_Business Corporate Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132833176.0
num_examples: 300
download_size: 132657300
dataset_size: 132833176.0
- config_name: Law_Civil Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 120799613.0
num_examples: 300
download_size: 120705948
dataset_size: 120799613.0
- config_name: Law_Constitutional Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124263458.0
num_examples: 300
download_size: 124147786
dataset_size: 124263458.0
- config_name: Law_Criminal Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125936929.0
num_examples: 300
download_size: 125829464
dataset_size: 125936929.0
- config_name: Law_Employment Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132215591.0
num_examples: 300
download_size: 132097839
dataset_size: 132215591.0
- config_name: Law_Environmental Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141112457.0
num_examples: 300
download_size: 140980187
dataset_size: 141112457.0
- config_name: Law_European Union Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134430087.0
num_examples: 300
download_size: 134291260
dataset_size: 134430087.0
- config_name: Law_International Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132972818.0
num_examples: 300
download_size: 132822729
dataset_size: 132972818.0
- config_name: Law_Law and Legal Studies
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124902845.0
num_examples: 300
download_size: 124767772
dataset_size: 124902845.0
- config_name: Law_Public Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 119886102.0
num_examples: 300
download_size: 119768166
dataset_size: 119886102.0
- config_name: Law_Tax Law
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126528701.0
num_examples: 300
download_size: 126415023
dataset_size: 126528701.0
- config_name: Medical Sciences_Anatomy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124345096.0
num_examples: 300
download_size: 124253091
dataset_size: 124345096.0
- config_name: Medical Sciences_Anesthesiology and Reanimation
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129149763.0
num_examples: 300
download_size: 129028143
dataset_size: 129149763.0
- config_name: Medical Sciences_Audiology and Speech Pathology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134675137.0
num_examples: 300
download_size: 134564783
dataset_size: 134675137.0
- config_name: Medical Sciences_Bacteriology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129314886.0
num_examples: 300
download_size: 129190011
dataset_size: 129314886.0
- config_name: Medical Sciences_Biochemistry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125011940.0
num_examples: 300
download_size: 124932996
dataset_size: 125011940.0
- config_name: Medical Sciences_Biophysics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126020992.0
num_examples: 300
download_size: 125897336
dataset_size: 126020992.0
- config_name: Medical Sciences_Biostatistics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118651656.0
num_examples: 300
download_size: 118574377
dataset_size: 118651656.0
- config_name: Medical Sciences_Cardiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135302197.0
num_examples: 300
download_size: 135193717
dataset_size: 135302197.0
- config_name: Medical Sciences_Cardiovascular Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137987783.0
num_examples: 300
download_size: 137879610
dataset_size: 137987783.0
- config_name: Medical Sciences_Chest Diseases
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131629091.0
num_examples: 300
download_size: 131486615
dataset_size: 131629091.0
- config_name: Medical Sciences_Child and Adolescent Psychiatry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 152654204.0
num_examples: 300
download_size: 152523834
dataset_size: 152654204.0
- config_name: Medical Sciences_Clinical Pathology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133021566.0
num_examples: 300
download_size: 132912535
dataset_size: 133021566.0
- config_name: Medical Sciences_Dentistry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135479829.0
num_examples: 300
download_size: 135352775
dataset_size: 135479829.0
- config_name: Medical Sciences_Dermatology and Venereology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125724038.0
num_examples: 300
download_size: 125637034
dataset_size: 125724038.0
- config_name: Medical Sciences_Emergency Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135705901.0
num_examples: 300
download_size: 135572579
dataset_size: 135705901.0
- config_name: Medical Sciences_Endocrinology and Metabolism
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136547926.0
num_examples: 300
download_size: 136424174
dataset_size: 136547926.0
- config_name: Medical Sciences_Epidemiology and Public Health
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 122443760.0
num_examples: 300
download_size: 122331509
dataset_size: 122443760.0
- config_name: Medical Sciences_Family Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147162214.0
num_examples: 300
download_size: 147018769
dataset_size: 147162214.0
- config_name: Medical Sciences_Forensic Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135621041.0
num_examples: 300
download_size: 135465069
dataset_size: 135621041.0
- config_name: Medical Sciences_Gastroenterology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137843323.0
num_examples: 300
download_size: 137726037
dataset_size: 137843323.0
- config_name: Medical Sciences_General Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124773122.0
num_examples: 300
download_size: 124665167
dataset_size: 124773122.0
- config_name: Medical Sciences_Geriatrics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 149601165.0
num_examples: 300
download_size: 149441668
dataset_size: 149601165.0
- config_name: Medical Sciences_Health Administration
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137277345.0
num_examples: 300
download_size: 137127990
dataset_size: 137277345.0
- config_name: Medical Sciences_Health Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132340082.0
num_examples: 300
download_size: 132191040
dataset_size: 132340082.0
- config_name: Medical Sciences_Hematology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137161132.0
num_examples: 300
download_size: 137001185
dataset_size: 137161132.0
- config_name: Medical Sciences_Histology and Embriology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118029910.0
num_examples: 300
download_size: 117960878
dataset_size: 118029910.0
- config_name: Medical Sciences_Immunology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 147571192.0
num_examples: 300
download_size: 147439785
dataset_size: 147571192.0
- config_name: Medical Sciences_Infectious Diseases
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130628555.0
num_examples: 300
download_size: 130515362
dataset_size: 130628555.0
- config_name: Medical Sciences_Internal Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132341319.0
num_examples: 300
download_size: 132242597
dataset_size: 132341319.0
- config_name: Medical Sciences_Medical Biochemistry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141321514.0
num_examples: 300
download_size: 141192803
dataset_size: 141321514.0
- config_name: Medical Sciences_Medical Biology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123713781.0
num_examples: 300
download_size: 123626323
dataset_size: 123713781.0
- config_name: Medical Sciences_Medical Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130348018.0
num_examples: 300
download_size: 130247442
dataset_size: 130348018.0
- config_name: Medical Sciences_Medical Genetics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132739285.0
num_examples: 300
download_size: 132620709
dataset_size: 132739285.0
- config_name: Medical Sciences_Medical Microbiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131818843.0
num_examples: 300
download_size: 131710880
dataset_size: 131818843.0
- config_name: Medical Sciences_Medical Oncology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132891133.0
num_examples: 300
download_size: 132742137
dataset_size: 132891133.0
- config_name: Medical Sciences_Medical Parasitology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127638224.0
num_examples: 300
download_size: 127533891
dataset_size: 127638224.0
- config_name: Medical Sciences_Medical Physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128012792.0
num_examples: 300
download_size: 127907099
dataset_size: 128012792.0
- config_name: Medical Sciences_Medical Physiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123009232.0
num_examples: 300
download_size: 122906320
dataset_size: 123009232.0
- config_name: Medical Sciences_Medical Virology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129423629.0
num_examples: 300
download_size: 129321752
dataset_size: 129423629.0
- config_name: Medical Sciences_Microbiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133143959.0
num_examples: 300
download_size: 132988663
dataset_size: 133143959.0
- config_name: Medical Sciences_Molecular Biology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127464967.0
num_examples: 300
download_size: 127337963
dataset_size: 127464967.0
- config_name: Medical Sciences_Mycology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137823673.0
num_examples: 300
download_size: 137708636
dataset_size: 137823673.0
- config_name: Medical Sciences_Neonatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141049258.0
num_examples: 300
download_size: 140933138
dataset_size: 141049258.0
- config_name: Medical Sciences_Nephrology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133628216.0
num_examples: 300
download_size: 133504498
dataset_size: 133628216.0
- config_name: Medical Sciences_Neurology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136508584.0
num_examples: 300
download_size: 136386376
dataset_size: 136508584.0
- config_name: Medical Sciences_Neuroscience
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126214227.0
num_examples: 300
download_size: 126138247
dataset_size: 126214227.0
- config_name: Medical Sciences_Neurosurgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 139598205.0
num_examples: 300
download_size: 139459556
dataset_size: 139598205.0
- config_name: Medical Sciences_Nuclear Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141475957.0
num_examples: 300
download_size: 141349187
dataset_size: 141475957.0
- config_name: Medical Sciences_Nursing and Midwifery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125067849.0
num_examples: 300
download_size: 124961824
dataset_size: 125067849.0
- config_name: Medical Sciences_Nutrition and Dietetics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137298930.0
num_examples: 300
download_size: 137177542
dataset_size: 137298930.0
- config_name: Medical Sciences_Obstetrics and Gynecology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 139462396.0
num_examples: 300
download_size: 139346196
dataset_size: 139462396.0
- config_name: Medical Sciences_Occupational Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 139789686.0
num_examples: 300
download_size: 139663646
dataset_size: 139789686.0
- config_name: Medical Sciences_Ophthalmology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128256576.0
num_examples: 300
download_size: 128137213
dataset_size: 128256576.0
- config_name: Medical Sciences_Optometry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124158526.0
num_examples: 300
download_size: 124043338
dataset_size: 124158526.0
- config_name: Medical Sciences_Orthopedics and Traumatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124954258.0
num_examples: 300
download_size: 124839699
dataset_size: 124954258.0
- config_name: Medical Sciences_Otorhinolaryngology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118568192.0
num_examples: 300
download_size: 118469263
dataset_size: 118568192.0
- config_name: Medical Sciences_Parasitology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128606032.0
num_examples: 300
download_size: 128481740
dataset_size: 128606032.0
- config_name: Medical Sciences_Pathology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136361718.0
num_examples: 300
download_size: 136219475
dataset_size: 136361718.0
- config_name: Medical Sciences_Pediatric Cardiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125106812.0
num_examples: 300
download_size: 125019625
dataset_size: 125106812.0
- config_name: Medical Sciences_Pediatric Endocrinology and Metabolism
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133790952.0
num_examples: 300
download_size: 133675104
dataset_size: 133790952.0
- config_name: Medical Sciences_Pediatric Gastroenterology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129939533.0
num_examples: 300
download_size: 129818254
dataset_size: 129939533.0
- config_name: Medical Sciences_Pediatric Hematology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130557879.0
num_examples: 300
download_size: 130455018
dataset_size: 130557879.0
- config_name: Medical Sciences_Pediatric Immunology and Allergy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124548519.0
num_examples: 300
download_size: 124454909
dataset_size: 124548519.0
- config_name: Medical Sciences_Pediatric Infectious Diseases
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129885463.0
num_examples: 300
download_size: 129772398
dataset_size: 129885463.0
- config_name: Medical Sciences_Pediatric Intensive Care
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136008333.0
num_examples: 300
download_size: 135876113
dataset_size: 136008333.0
- config_name: Medical Sciences_Pediatric Nephrology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133539276.0
num_examples: 300
download_size: 133420904
dataset_size: 133539276.0
- config_name: Medical Sciences_Pediatric Neurology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130006445.0
num_examples: 300
download_size: 129883565
dataset_size: 130006445.0
- config_name: Medical Sciences_Pediatric Pulmonology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131918311.0
num_examples: 300
download_size: 131790321
dataset_size: 131918311.0
- config_name: Medical Sciences_Pediatric Rheumatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141173770.0
num_examples: 300
download_size: 141048082
dataset_size: 141173770.0
- config_name: Medical Sciences_Pediatric Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129573172.0
num_examples: 300
download_size: 129467025
dataset_size: 129573172.0
- config_name: Medical Sciences_Pediatrics and Child Health
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 142513323.0
num_examples: 300
download_size: 142398544
dataset_size: 142513323.0
- config_name: Medical Sciences_Perinatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143238723.0
num_examples: 300
download_size: 143075573
dataset_size: 143238723.0
- config_name: Medical Sciences_Pharmacology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131266646.0
num_examples: 300
download_size: 131140692
dataset_size: 131266646.0
- config_name: Medical Sciences_Pharmacy & Pharmaceutical Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123536721.0
num_examples: 300
download_size: 123432708
dataset_size: 123536721.0
- config_name: Medical Sciences_Physical Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 138883982.0
num_examples: 300
download_size: 138766735
dataset_size: 138883982.0
- config_name: Medical Sciences_Physiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129536853.0
num_examples: 300
download_size: 129405940
dataset_size: 129536853.0
- config_name: Medical Sciences_Physiotherapy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 142691474.0
num_examples: 300
download_size: 142563292
dataset_size: 142691474.0
- config_name: Medical Sciences_Plastic Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 131666178.0
num_examples: 300
download_size: 131555009
dataset_size: 131666178.0
- config_name: Medical Sciences_Podiatry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 130451437.0
num_examples: 300
download_size: 130325455
dataset_size: 130451437.0
- config_name: Medical Sciences_Psychiatry
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 137513120.0
num_examples: 300
download_size: 137383527
dataset_size: 137513120.0
- config_name: Medical Sciences_Radiation Oncology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 146934885.0
num_examples: 300
download_size: 146815433
dataset_size: 146934885.0
- config_name: Medical Sciences_Radiology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 148168300.0
num_examples: 300
download_size: 148016600
dataset_size: 148168300.0
- config_name: Medical Sciences_Rheumatology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134954977.0
num_examples: 300
download_size: 134841511
dataset_size: 134954977.0
- config_name: Medical Sciences_Sport Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127576853.0
num_examples: 300
download_size: 127455316
dataset_size: 127576853.0
- config_name: Medical Sciences_Sports Medicine
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135083531.0
num_examples: 300
download_size: 134931348
dataset_size: 135083531.0
- config_name: Medical Sciences_Thoracic Surgery
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135906719.0
num_examples: 300
download_size: 135778944
dataset_size: 135906719.0
- config_name: Medical Sciences_Urology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135596805.0
num_examples: 300
download_size: 135473770
dataset_size: 135596805.0
- config_name: Medical Sciences_Veterinary Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 135858075.0
num_examples: 300
download_size: 135730165
dataset_size: 135858075.0
- config_name: Medical Sciences_Virology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127937723.0
num_examples: 300
download_size: 127838000
dataset_size: 127937723.0
- config_name: Natural Sciences_Applied physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126350419.0
num_examples: 300
download_size: 126248052
dataset_size: 126350419.0
- config_name: Natural Sciences_Astrophysics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 129300703.0
num_examples: 300
download_size: 129158168
dataset_size: 129300703.0
- config_name: Natural Sciences_Atomic, Molecular and Optical physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 111687416.0
num_examples: 300
download_size: 111582196
dataset_size: 111687416.0
- config_name: Natural Sciences_Biological Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126050128.0
num_examples: 300
download_size: 125945290
dataset_size: 126050128.0
- config_name: Natural Sciences_Chemical Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 125925185.0
num_examples: 300
download_size: 125809833
dataset_size: 125925185.0
- config_name: Natural Sciences_Condensed matter physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 119880781.0
num_examples: 300
download_size: 119762462
dataset_size: 119880781.0
- config_name: Natural Sciences_Geography
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 127678573.0
num_examples: 300
download_size: 127551992
dataset_size: 127678573.0
- config_name: Natural Sciences_Mathematical Sciences
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118078153.0
num_examples: 300
download_size: 117964811
dataset_size: 118078153.0
- config_name: Natural Sciences_Molecular Biology and Genetics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 112294561.0
num_examples: 300
download_size: 112198712
dataset_size: 112294561.0
- config_name: Natural Sciences_Nuclear and Particle Physics
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 121217938.0
num_examples: 300
download_size: 121108176
dataset_size: 121217938.0
- config_name: Philosophy_Philosophy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118345587.0
num_examples: 300
download_size: 118229918
dataset_size: 118345587.0
- config_name: Social Sciences_Anthropology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 128840376.0
num_examples: 300
download_size: 128696216
dataset_size: 128840376.0
- config_name: Social Sciences_Archeology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 118321559.0
num_examples: 300
download_size: 118206487
dataset_size: 118321559.0
- config_name: Social Sciences_Child Development
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 126576147.0
num_examples: 300
download_size: 126464165
dataset_size: 126576147.0
- config_name: Social Sciences_Demography
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 132052357.0
num_examples: 300
download_size: 131901043
dataset_size: 132052357.0
- config_name: Social Sciences_Higher Education Studies
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 141786814.0
num_examples: 300
download_size: 141661233
dataset_size: 141786814.0
- config_name: Social Sciences_Housing
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 146169123.0
num_examples: 300
download_size: 146033728
dataset_size: 146169123.0
- config_name: Social Sciences_International Relations
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133839740.0
num_examples: 300
download_size: 133676984
dataset_size: 133839740.0
- config_name: Social Sciences_Library and Information Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 123726092.0
num_examples: 300
download_size: 123594991
dataset_size: 123726092.0
- config_name: Social Sciences_Linguistics and Literature
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 114704654.0
num_examples: 300
download_size: 114595695
dataset_size: 114704654.0
- config_name: Social Sciences_Open and Distance Education
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 143105156.0
num_examples: 300
download_size: 142956652
dataset_size: 143105156.0
- config_name: Social Sciences_Political Science
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 121094624.0
num_examples: 300
download_size: 120963345
dataset_size: 121094624.0
- config_name: Social Sciences_Psychology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 136275316.0
num_examples: 300
download_size: 136139111
dataset_size: 136275316.0
- config_name: Social Sciences_Regional Studies
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 124353132.0
num_examples: 300
download_size: 124243486
dataset_size: 124353132.0
- config_name: Social Sciences_Social Policy
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134904666.0
num_examples: 300
download_size: 134753980
dataset_size: 134904666.0
- config_name: Social Sciences_Social Work
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 134077053.0
num_examples: 300
download_size: 133967130
dataset_size: 134077053.0
- config_name: Social Sciences_Sociology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 133329557.0
num_examples: 300
download_size: 133180184
dataset_size: 133329557.0
- config_name: Social Sciences_Tourism and Hospitality
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 142262231.0
num_examples: 300
download_size: 142100591
dataset_size: 142262231.0
- config_name: Social Sciences_Transportation Science and Technology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 140265612.0
num_examples: 300
download_size: 140124964
dataset_size: 140265612.0
- config_name: Theology_Theology
features:
- name: audio
dtype: audio
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 115449408.0
num_examples: 300
download_size: 115356333
dataset_size: 115449408.0
- config_name: testing
features:
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: sentence
dtype: string
splits:
- name: test
num_bytes: 115449370.0
num_examples: 300
download_size: 115356390
dataset_size: 115449370.0
configs:
- config_name: Agriculture_Agricultural Biotechnology
data_files:
- split: test
path: content/Agriculture/Agricultural Biotechnology/test-*
- config_name: Agriculture_Agricultural Economics
data_files:
- split: test
path: content/Agriculture/Agricultural Economics/test-*
- config_name: Agriculture_Agricultural Engineering
data_files:
- split: test
path: content/Agriculture/Agricultural Engineering/test-*
- config_name: Agriculture_Agricultural Mechanization
data_files:
- split: test
path: content/Agriculture/Agricultural Mechanization/test-*
- config_name: Agriculture_Animal Science
data_files:
- split: test
path: content/Agriculture/Animal Science/test-*
- config_name: Agriculture_Crop Science
data_files:
- split: test
path: content/Agriculture/Crop Science/test-*
- config_name: Agriculture_Entomology and Pesticides
data_files:
- split: test
path: content/Agriculture/Entomology and Pesticides/test-*
- config_name: Agriculture_Fisheries
data_files:
- split: test
path: content/Agriculture/Fisheries/test-*
- config_name: Agriculture_Forestry
data_files:
- split: test
path: content/Agriculture/Forestry/test-*
- config_name: Agriculture_Horticulture
data_files:
- split: test
path: content/Agriculture/Horticulture/test-*
- config_name: Agriculture_Plant Science
data_files:
- split: test
path: content/Agriculture/Plant Science/test-*
- config_name: Agriculture_Poultry Production
data_files:
- split: test
path: content/Agriculture/Poultry Production/test-*
- config_name: Agriculture_Soil Sciences and Plant Nutrition
data_files:
- split: test
path: content/Agriculture/Soil Sciences and Plant Nutrition/test-*
- config_name: Agriculture_Soil and Water Engineering and Conservation
data_files:
- split: test
path: content/Agriculture/Soil and Water Engineering and Conservation/test-*
- config_name: Arts Design_Arts
data_files:
- split: test
path: content/Arts Design/Arts/test-*
- config_name: Arts Design_Design
data_files:
- split: test
path: content/Arts Design/Design/test-*
- config_name: Arts Design_Interior Architecture
data_files:
- split: test
path: content/Arts Design/Interior Architecture/test-*
- config_name: Arts Design_Urban Planning
data_files:
- split: test
path: content/Arts Design/Urban Planning/test-*
- config_name: Business_Business Administration
data_files:
- split: test
path: content/Business/Business Administration/test-*
- config_name: Business_Communications and Media Studies
data_files:
- split: test
path: content/Business/Communications and Media Studies/test-*
- config_name: Business_Decision Science and Operations Management
data_files:
- split: test
path: content/Business/Decision Science and Operations Management/test-*
- config_name: Business_Entrepreneurship
data_files:
- split: test
path: content/Business/Entrepreneurship/test-*
- config_name: Business_Human Resource Management
data_files:
- split: test
path: content/Business/Human Resource Management/test-*
- config_name: Business_Marketing
data_files:
- split: test
path: content/Business/Marketing/test-*
- config_name: Business_Public Administration
data_files:
- split: test
path: content/Business/Public Administration/test-*
- config_name: Business_Strategic Management
data_files:
- split: test
path: content/Business/Strategic Management/test-*
- config_name: Economics_Accounting and Finance
data_files:
- split: test
path: content/Economics/Accounting and Finance/test-*
- config_name: Economics_Banking and Insurance
data_files:
- split: test
path: content/Economics/Banking and Insurance/test-*
- config_name: Economics_Environmental Economics
data_files:
- split: test
path: content/Economics/Environmental Economics/test-*
- config_name: Economics_Financial Economics
data_files:
- split: test
path: content/Economics/Financial Economics/test-*
- config_name: Economics_International Trade
data_files:
- split: test
path: content/Economics/International Trade/test-*
- config_name: Education_Early Childhood Education
data_files:
- split: test
path: content/Education/Early Childhood Education/test-*
- config_name: Education_Educational Administration
data_files:
- split: test
path: content/Education/Educational Administration/test-*
- config_name: Education_Educational Psychology
data_files:
- split: test
path: content/Education/Educational Psychology/test-*
- config_name: Education_Educational Technology
data_files:
- split: test
path: content/Education/Educational Technology/test-*
- config_name: Education_Elemantary Teacher Education
data_files:
- split: test
path: content/Education/Elemantary Teacher Education/test-*
- config_name: Education_Foreign Language Education
data_files:
- split: test
path: content/Education/Foreign Language Education/test-*
- config_name: Education_Guidance and Counseling
data_files:
- split: test
path: content/Education/Guidance and Counseling/test-*
- config_name: Education_Mathematics and Science Education
data_files:
- split: test
path: content/Education/Mathematics and Science Education/test-*
- config_name: Education_Physical Education
data_files:
- split: test
path: content/Education/Physical Education/test-*
- config_name: Education_Sociology of Education
data_files:
- split: test
path: content/Education/Sociology of Education/test-*
- config_name: Education_Special Education
data_files:
- split: test
path: content/Education/Special Education/test-*
- config_name: Engineering_Aerospace Engineering
data_files:
- split: test
path: content/Engineering/Aerospace Engineering/test-*
- config_name: Engineering_Automotive Engineering
data_files:
- split: test
path: content/Engineering/Automotive Engineering/test-*
- config_name: Engineering_Bioengineering
data_files:
- split: test
path: content/Engineering/Bioengineering/test-*
- config_name: Engineering_Biomaterials and Tissue Engineering
data_files:
- split: test
path: content/Engineering/Biomaterials and Tissue Engineering/test-*
- config_name: Engineering_Biomedical Engineering
data_files:
- split: test
path: content/Engineering/Biomedical Engineering/test-*
- config_name: Engineering_Chemical Engineering
data_files:
- split: test
path: content/Engineering/Chemical Engineering/test-*
- config_name: Engineering_Civil Engineering
data_files:
- split: test
path: content/Engineering/Civil Engineering/test-*
- config_name: Engineering_Computer Science
data_files:
- split: test
path: content/Engineering/Computer Science/test-*
- config_name: Engineering_Earth Sciences
data_files:
- split: test
path: content/Engineering/Earth Sciences/test-*
- config_name: Engineering_Electrical and Electronic Engineering
data_files:
- split: test
path: content/Engineering/Electrical and Electronic Engineering/test-*
- config_name: Engineering_Electrical and Information Engineering
data_files:
- split: test
path: content/Engineering/Electrical and Information Engineering/test-*
- config_name: Engineering_Energy Engineering
data_files:
- split: test
path: content/Engineering/Energy Engineering/test-*
- config_name: Engineering_Environmental Science and Engineering
data_files:
- split: test
path: content/Engineering/Environmental Science and Engineering/test-*
- config_name: Engineering_Food Science and Engineering
data_files:
- split: test
path: content/Engineering/Food Science and Engineering/test-*
- config_name: Engineering_Geomatics Engineering
data_files:
- split: test
path: content/Engineering/Geomatics Engineering/test-*
- config_name: Engineering_Industrial and Manufacturing Engineering
data_files:
- split: test
path: content/Engineering/Industrial and Manufacturing Engineering/test-*
- config_name: Engineering_Marine Sciences and Engineering
data_files:
- split: test
path: content/Engineering/Marine Sciences and Engineering/test-*
- config_name: Engineering_Mechanical Engineering
data_files:
- split: test
path: content/Engineering/Mechanical Engineering/test-*
- config_name: Engineering_Mechatronics Engineering
data_files:
- split: test
path: content/Engineering/Mechatronics Engineering/test-*
- config_name: Engineering_Metallurgical and Materials Engineering
data_files:
- split: test
path: content/Engineering/Metallurgical and Materials Engineering/test-*
- config_name: Engineering_Meteorology and Atmospheric Sciences
data_files:
- split: test
path: content/Engineering/Meteorology and Atmospheric Sciences/test-*
- config_name: Engineering_Mining Engineering
data_files:
- split: test
path: content/Engineering/Mining Engineering/test-*
- config_name: Engineering_Nanoscience and Nanotechnology
data_files:
- split: test
path: content/Engineering/Nanoscience and Nanotechnology/test-*
- config_name: Engineering_Nuclear Engineering
data_files:
- split: test
path: content/Engineering/Nuclear Engineering/test-*
- config_name: Engineering_Petroleum Engineering
data_files:
- split: test
path: content/Engineering/Petroleum Engineering/test-*
- config_name: Engineering_Textile Engineering
data_files:
- split: test
path: content/Engineering/Textile Engineering/test-*
- config_name: History_History
data_files:
- split: test
path: content/History/History/test-*
- config_name: Law_Business Corporate Law
data_files:
- split: test
path: content/Law/Business Corporate Law/test-*
- config_name: Law_Civil Law
data_files:
- split: test
path: content/Law/Civil Law/test-*
- config_name: Law_Constitutional Law
data_files:
- split: test
path: content/Law/Constitutional Law/test-*
- config_name: Law_Criminal Law
data_files:
- split: test
path: content/Law/Criminal Law/test-*
- config_name: Law_Employment Law
data_files:
- split: test
path: content/Law/Employment Law/test-*
- config_name: Law_Environmental Law
data_files:
- split: test
path: content/Law/Environmental Law/test-*
- config_name: Law_European Union Law
data_files:
- split: test
path: content/Law/European Union Law/test-*
- config_name: Law_International Law
data_files:
- split: test
path: content/Law/International Law/test-*
- config_name: Law_Law and Legal Studies
data_files:
- split: test
path: content/Law/Law and Legal Studies/test-*
- config_name: Law_Public Law
data_files:
- split: test
path: content/Law/Public Law/test-*
- config_name: Law_Tax Law
data_files:
- split: test
path: content/Law/Tax Law/test-*
- config_name: Medical Sciences_Anatomy
data_files:
- split: test
path: content/Medical Sciences/Anatomy/test-*
- config_name: Medical Sciences_Anesthesiology and Reanimation
data_files:
- split: test
path: content/Medical Sciences/Anesthesiology and Reanimation/test-*
- config_name: Medical Sciences_Audiology and Speech Pathology
data_files:
- split: test
path: content/Medical Sciences/Audiology and Speech Pathology/test-*
- config_name: Medical Sciences_Bacteriology
data_files:
- split: test
path: content/Medical Sciences/Bacteriology/test-*
- config_name: Medical Sciences_Biochemistry
data_files:
- split: test
path: content/Medical Sciences/Biochemistry/test-*
- config_name: Medical Sciences_Biophysics
data_files:
- split: test
path: content/Medical Sciences/Biophysics/test-*
- config_name: Medical Sciences_Biostatistics
data_files:
- split: test
path: content/Medical Sciences/Biostatistics/test-*
- config_name: Medical Sciences_Cardiology
data_files:
- split: test
path: content/Medical Sciences/Cardiology/test-*
- config_name: Medical Sciences_Cardiovascular Surgery
data_files:
- split: test
path: content/Medical Sciences/Cardiovascular Surgery/test-*
- config_name: Medical Sciences_Chest Diseases
data_files:
- split: test
path: content/Medical Sciences/Chest Diseases/test-*
- config_name: Medical Sciences_Child and Adolescent Psychiatry
data_files:
- split: test
path: content/Medical Sciences/Child and Adolescent Psychiatry/test-*
- config_name: Medical Sciences_Clinical Pathology
data_files:
- split: test
path: content/Medical Sciences/Clinical Pathology/test-*
- config_name: Medical Sciences_Dentistry
data_files:
- split: test
path: content/Medical Sciences/Dentistry/test-*
- config_name: Medical Sciences_Dermatology and Venereology
data_files:
- split: test
path: content/Medical Sciences/Dermatology and Venereology/test-*
- config_name: Medical Sciences_Emergency Medicine
data_files:
- split: test
path: content/Medical Sciences/Emergency Medicine/test-*
- config_name: Medical Sciences_Endocrinology and Metabolism
data_files:
- split: test
path: content/Medical Sciences/Endocrinology and Metabolism/test-*
- config_name: Medical Sciences_Epidemiology and Public Health
data_files:
- split: test
path: content/Medical Sciences/Epidemiology and Public Health/test-*
- config_name: Medical Sciences_Family Medicine
data_files:
- split: test
path: content/Medical Sciences/Family Medicine/test-*
- config_name: Medical Sciences_Forensic Medicine
data_files:
- split: test
path: content/Medical Sciences/Forensic Medicine/test-*
- config_name: Medical Sciences_Gastroenterology
data_files:
- split: test
path: content/Medical Sciences/Gastroenterology/test-*
- config_name: Medical Sciences_General Surgery
data_files:
- split: test
path: content/Medical Sciences/General Surgery/test-*
- config_name: Medical Sciences_Geriatrics
data_files:
- split: test
path: content/Medical Sciences/Geriatrics/test-*
- config_name: Medical Sciences_Health Administration
data_files:
- split: test
path: content/Medical Sciences/Health Administration/test-*
- config_name: Medical Sciences_Health Sciences
data_files:
- split: test
path: content/Medical Sciences/Health Sciences/test-*
- config_name: Medical Sciences_Hematology
data_files:
- split: test
path: content/Medical Sciences/Hematology/test-*
- config_name: Medical Sciences_Histology and Embriology
data_files:
- split: test
path: content/Medical Sciences/Histology and Embriology/test-*
- config_name: Medical Sciences_Immunology
data_files:
- split: test
path: content/Medical Sciences/Immunology/test-*
- config_name: Medical Sciences_Infectious Diseases
data_files:
- split: test
path: content/Medical Sciences/Infectious Diseases/test-*
- config_name: Medical Sciences_Internal Medicine
data_files:
- split: test
path: content/Medical Sciences/Internal Medicine/test-*
- config_name: Medical Sciences_Medical Biochemistry
data_files:
- split: test
path: content/Medical Sciences/Medical Biochemistry/test-*
- config_name: Medical Sciences_Medical Biology
data_files:
- split: test
path: content/Medical Sciences/Medical Biology/test-*
- config_name: Medical Sciences_Medical Education
data_files:
- split: test
path: content/Medical Sciences/Medical Education/test-*
- config_name: Medical Sciences_Medical Genetics
data_files:
- split: test
path: content/Medical Sciences/Medical Genetics/test-*
- config_name: Medical Sciences_Medical Microbiology
data_files:
- split: test
path: content/Medical Sciences/Medical Microbiology/test-*
- config_name: Medical Sciences_Medical Oncology
data_files:
- split: test
path: content/Medical Sciences/Medical Oncology/test-*
- config_name: Medical Sciences_Medical Parasitology
data_files:
- split: test
path: content/Medical Sciences/Medical Parasitology/test-*
- config_name: Medical Sciences_Medical Physics
data_files:
- split: test
path: content/Medical Sciences/Medical Physics/test-*
- config_name: Medical Sciences_Medical Physiology
data_files:
- split: test
path: content/Medical Sciences/Medical Physiology/test-*
- config_name: Medical Sciences_Medical Virology
data_files:
- split: test
path: content/Medical Sciences/Medical Virology/test-*
- config_name: Medical Sciences_Microbiology
data_files:
- split: test
path: content/Medical Sciences/Microbiology/test-*
- config_name: Medical Sciences_Molecular Biology
data_files:
- split: test
path: content/Medical Sciences/Molecular Biology/test-*
- config_name: Medical Sciences_Mycology
data_files:
- split: test
path: content/Medical Sciences/Mycology/test-*
- config_name: Medical Sciences_Neonatology
data_files:
- split: test
path: content/Medical Sciences/Neonatology/test-*
- config_name: Medical Sciences_Nephrology
data_files:
- split: test
path: content/Medical Sciences/Nephrology/test-*
- config_name: Medical Sciences_Neurology
data_files:
- split: test
path: content/Medical Sciences/Neurology/test-*
- config_name: Medical Sciences_Neuroscience
data_files:
- split: test
path: content/Medical Sciences/Neuroscience/test-*
- config_name: Medical Sciences_Neurosurgery
data_files:
- split: test
path: content/Medical Sciences/Neurosurgery/test-*
- config_name: Medical Sciences_Nuclear Medicine
data_files:
- split: test
path: content/Medical Sciences/Nuclear Medicine/test-*
- config_name: Medical Sciences_Nursing and Midwifery
data_files:
- split: test
path: content/Medical Sciences/Nursing and Midwifery/test-*
- config_name: Medical Sciences_Nutrition and Dietetics
data_files:
- split: test
path: content/Medical Sciences/Nutrition and Dietetics/test-*
- config_name: Medical Sciences_Obstetrics and Gynecology
data_files:
- split: test
path: content/Medical Sciences/Obstetrics and Gynecology/test-*
- config_name: Medical Sciences_Occupational Medicine
data_files:
- split: test
path: content/Medical Sciences/Occupational Medicine/test-*
- config_name: Medical Sciences_Ophthalmology
data_files:
- split: test
path: content/Medical Sciences/Ophthalmology/test-*
- config_name: Medical Sciences_Optometry
data_files:
- split: test
path: content/Medical Sciences/Optometry/test-*
- config_name: Medical Sciences_Orthopedics and Traumatology
data_files:
- split: test
path: content/Medical Sciences/Orthopedics and Traumatology/test-*
- config_name: Medical Sciences_Otorhinolaryngology
data_files:
- split: test
path: content/Medical Sciences/Otorhinolaryngology/test-*
- config_name: Medical Sciences_Parasitology
data_files:
- split: test
path: content/Medical Sciences/Parasitology/test-*
- config_name: Medical Sciences_Pathology
data_files:
- split: test
path: content/Medical Sciences/Pathology/test-*
- config_name: Medical Sciences_Pediatric Cardiology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Cardiology/test-*
- config_name: Medical Sciences_Pediatric Endocrinology and Metabolism
data_files:
- split: test
path: content/Medical Sciences/Pediatric Endocrinology and Metabolism/test-*
- config_name: Medical Sciences_Pediatric Gastroenterology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Gastroenterology/test-*
- config_name: Medical Sciences_Pediatric Hematology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Hematology/test-*
- config_name: Medical Sciences_Pediatric Immunology and Allergy
data_files:
- split: test
path: content/Medical Sciences/Pediatric Immunology and Allergy/test-*
- config_name: Medical Sciences_Pediatric Infectious Diseases
data_files:
- split: test
path: content/Medical Sciences/Pediatric Infectious Diseases/test-*
- config_name: Medical Sciences_Pediatric Intensive Care
data_files:
- split: test
path: content/Medical Sciences/Pediatric Intensive Care/test-*
- config_name: Medical Sciences_Pediatric Nephrology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Nephrology/test-*
- config_name: Medical Sciences_Pediatric Neurology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Neurology/test-*
- config_name: Medical Sciences_Pediatric Pulmonology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Pulmonology/test-*
- config_name: Medical Sciences_Pediatric Rheumatology
data_files:
- split: test
path: content/Medical Sciences/Pediatric Rheumatology/test-*
- config_name: Medical Sciences_Pediatric Surgery
data_files:
- split: test
path: content/Medical Sciences/Pediatric Surgery/test-*
- config_name: Medical Sciences_Pediatrics and Child Health
data_files:
- split: test
path: content/Medical Sciences/Pediatrics and Child Health/test-*
- config_name: Medical Sciences_Perinatology
data_files:
- split: test
path: content/Medical Sciences/Perinatology/test-*
- config_name: Medical Sciences_Pharmacology
data_files:
- split: test
path: content/Medical Sciences/Pharmacology/test-*
- config_name: Medical Sciences_Pharmacy & Pharmaceutical Sciences
data_files:
- split: test
path: content/Medical Sciences/Pharmacy & Pharmaceutical Sciences/test-*
- config_name: Medical Sciences_Physical Medicine
data_files:
- split: test
path: content/Medical Sciences/Physical Medicine/test-*
- config_name: Medical Sciences_Physiology
data_files:
- split: test
path: content/Medical Sciences/Physiology/test-*
- config_name: Medical Sciences_Physiotherapy
data_files:
- split: test
path: content/Medical Sciences/Physiotherapy/test-*
- config_name: Medical Sciences_Plastic Surgery
data_files:
- split: test
path: content/Medical Sciences/Plastic Surgery/test-*
- config_name: Medical Sciences_Podiatry
data_files:
- split: test
path: content/Medical Sciences/Podiatry/test-*
- config_name: Medical Sciences_Psychiatry
data_files:
- split: test
path: content/Medical Sciences/Psychiatry/test-*
- config_name: Medical Sciences_Radiation Oncology
data_files:
- split: test
path: content/Medical Sciences/Radiation Oncology/test-*
- config_name: Medical Sciences_Radiology
data_files:
- split: test
path: content/Medical Sciences/Radiology/test-*
- config_name: Medical Sciences_Rheumatology
data_files:
- split: test
path: content/Medical Sciences/Rheumatology/test-*
- config_name: Medical Sciences_Sport Science
data_files:
- split: test
path: content/Medical Sciences/Sport Science/test-*
- config_name: Medical Sciences_Sports Medicine
data_files:
- split: test
path: content/Medical Sciences/Sports Medicine/test-*
- config_name: Medical Sciences_Thoracic Surgery
data_files:
- split: test
path: content/Medical Sciences/Thoracic Surgery/test-*
- config_name: Medical Sciences_Urology
data_files:
- split: test
path: content/Medical Sciences/Urology/test-*
- config_name: Medical Sciences_Veterinary Sciences
data_files:
- split: test
path: content/Medical Sciences/Veterinary Sciences/test-*
- config_name: Medical Sciences_Virology
data_files:
- split: test
path: content/Medical Sciences/Virology/test-*
- config_name: Natural Sciences_Applied physics
data_files:
- split: test
path: content/Natural Sciences/Applied physics/test-*
- config_name: Natural Sciences_Astrophysics
data_files:
- split: test
path: content/Natural Sciences/Astrophysics/test-*
- config_name: Natural Sciences_Atomic, Molecular and Optical physics
data_files:
- split: test
path: content/Natural Sciences/Atomic, Molecular and Optical physics/test-*
- config_name: Natural Sciences_Biological Science
data_files:
- split: test
path: content/Natural Sciences/Biological Science/test-*
- config_name: Natural Sciences_Chemical Sciences
data_files:
- split: test
path: content/Natural Sciences/Chemical Sciences/test-*
- config_name: Natural Sciences_Condensed matter physics
data_files:
- split: test
path: content/Natural Sciences/Condensed matter physics/test-*
- config_name: Natural Sciences_Geography
data_files:
- split: test
path: content/Natural Sciences/Geography/test-*
- config_name: Natural Sciences_Mathematical Sciences
data_files:
- split: test
path: content/Natural Sciences/Mathematical Sciences/test-*
- config_name: Natural Sciences_Molecular Biology and Genetics
data_files:
- split: test
path: content/Natural Sciences/Molecular Biology and Genetics/test-*
- config_name: Natural Sciences_Nuclear and Particle Physics
data_files:
- split: test
path: content/Natural Sciences/Nuclear and Particle Physics/test-*
- config_name: Philosophy_Philosophy
data_files:
- split: test
path: content/Philosophy/Philosophy/test-*
- config_name: Social Sciences_Anthropology
data_files:
- split: test
path: content/Social Sciences/Anthropology/test-*
- config_name: Social Sciences_Archeology
data_files:
- split: test
path: content/Social Sciences/Archeology/test-*
- config_name: Social Sciences_Child Development
data_files:
- split: test
path: content/Social Sciences/Child Development/test-*
- config_name: Social Sciences_Demography
data_files:
- split: test
path: content/Social Sciences/Demography/test-*
- config_name: Social Sciences_Higher Education Studies
data_files:
- split: test
path: content/Social Sciences/Higher Education Studies/test-*
- config_name: Social Sciences_Housing
data_files:
- split: test
path: content/Social Sciences/Housing/test-*
- config_name: Social Sciences_International Relations
data_files:
- split: test
path: content/Social Sciences/International Relations/test-*
- config_name: Social Sciences_Library and Information Science
data_files:
- split: test
path: content/Social Sciences/Library and Information Science/test-*
- config_name: Social Sciences_Linguistics and Literature
data_files:
- split: test
path: content/Social Sciences/Linguistics and Literature/test-*
- config_name: Social Sciences_Open and Distance Education
data_files:
- split: test
path: content/Social Sciences/Open and Distance Education/test-*
- config_name: Social Sciences_Political Science
data_files:
- split: test
path: content/Social Sciences/Political Science/test-*
- config_name: Social Sciences_Psychology
data_files:
- split: test
path: content/Social Sciences/Psychology/test-*
- config_name: Social Sciences_Regional Studies
data_files:
- split: test
path: content/Social Sciences/Regional Studies/test-*
- config_name: Social Sciences_Social Policy
data_files:
- split: test
path: content/Social Sciences/Social Policy/test-*
- config_name: Social Sciences_Social Work
data_files:
- split: test
path: content/Social Sciences/Social Work/test-*
- config_name: Social Sciences_Sociology
data_files:
- split: test
path: content/Social Sciences/Sociology/test-*
- config_name: Social Sciences_Tourism and Hospitality
data_files:
- split: test
path: content/Social Sciences/Tourism and Hospitality/test-*
- config_name: Social Sciences_Transportation Science and Technology
data_files:
- split: test
path: content/Social Sciences/Transportation Science and Technology/test-*
- config_name: Theology_Theology
data_files:
- split: test
path: content/Theology/Theology/test-*
- config_name: testing
data_files:
- split: test
path: /content/testing/test-*
---
# Multi-domain academic audio data for evaluating ASR models
## Dataset Summary
This dataset, named "DomainSpeech," is curated to serve as a robust evaluation tool for Automatic Speech Recognition (ASR) models, encompassing a broad spectrum of academic domains including Agriculture, Sciences, Engineering, and Business. A distinctive feature of this dataset is its deliberately challenging design: a technical terminology density of 20% is maintained across the texts, above the norm found in existing ASR evaluation datasets. This makes "DomainSpeech" well suited for validating how ASR systems handle domain-specific content, and a valuable asset for researchers and developers aiming to improve the accuracy and reliability of ASR systems in academic and professional settings.
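The 20% figure refers to the fraction of tokens that are domain-specific terms. A rough sketch of how such a density could be measured, assuming a flat term list (`terms` below is a hypothetical inventory; the authors' exact matching procedure, e.g. handling of multi-word terms, is not specified here):

```python
def term_density(text: str, terms: set[str]) -> float:
    """Fraction of word tokens that appear in the domain term list."""
    # crude tokenization: split on whitespace, strip trailing punctuation
    words = [w.strip(".,;:!?").lower() for w in text.split()]
    words = [w for w in words if w]
    if not words:
        return 0.0
    return sum(w in terms for w in words) / len(words)

# e.g. term_density("The myocardium contracts rhythmically.", {"myocardium"}) -> 0.25
```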
## Dataset Description
DomainSpeech is composed of 199 subsets, each contributing 300 rows of domain-specific English text data and corresponding 22050 Hz speech data. Each subset is named `{domain}_{subdomain}`. Although DomainSpeech focuses mainly on ASR evaluation, it also provides an extra 1,500 rows for fine-tuning in some subdomains (Anatomy, Anthropology, Cardiology, Dentistry, Pathology).
## How to Use
To utilize the "DomainSpeech" dataset, for example the 'Medical Sciences_Anatomy' subset, follow the steps below. This example demonstrates how to load that subset for analysis or model evaluation.
```python
from datasets import load_dataset
# Load the 'Medical Sciences_Anatomy' subset from the 'DomainSpeech' dataset
dataset = load_dataset("DoSp/DomainSpeech", "Medical Sciences_Anatomy")
```
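The audio in each subset is distributed at 22050 Hz, while many ASR front ends (e.g., Whisper) expect 16 kHz input, so resampling is usually needed before inference. In practice you would use `librosa`, `torchaudio`, or cast the column with `datasets.Audio(sampling_rate=16000)`; a dependency-free linear-interpolation sketch of the idea:

```python
def resample(samples: list[float], src_rate: int, dst_rate: int) -> list[float]:
    """Resample a mono signal via linear interpolation (illustrative only)."""
    n_out = int(len(samples) * dst_rate / src_rate)
    out = []
    for i in range(n_out):
        pos = i * src_rate / dst_rate          # position in the source signal
        lo = int(pos)
        hi = min(lo + 1, len(samples) - 1)
        frac = pos - lo
        out.append(samples[lo] * (1 - frac) + samples[hi] * frac)
    return out
```

Note that proper resampling also applies an anti-aliasing filter; this sketch only illustrates the rate conversion.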
## Evaluation Example
Full evaluation details can be found in our paper, "DomainSpeech: Domain Specific Corpus to Evaluate and Enhance ASR System".
| Model | Method | Anatomy | Anthropology | Cardiology | Dentistry | Pathology |
| ----------------- | ----- | ----- | ----- | ----- | ----- | ----- |
| **Whisper-small** | Baseline | 9.19 | 9.19 | 13.25 | 9.76 | 11.92 |
| **Whisper-small** | T5-base | 8.49 | 7.15 | 9.70 | 8.60 | 11.16 |
| **Whisper-large-v2** | Baseline | 3.98 | 3.19 | 6.17 | 4.33 | 6.85 |
| **Whisper-large-v2** | T5-base | 3.84 | 4.31 | 4.34 | 4.00 | 7.83 |
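The numbers above are error rates (lower is better), presumably word error rate (WER) in percent; consult the paper for the exact metric and decoding setup. For readers who want to score their own ASR output on the test transcripts, a self-contained word-level WER via edit distance:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level Levenshtein distance / reference length."""
    ref = reference.split()
    hyp = hypothesis.split()
    if not ref:
        raise ValueError("reference transcript must be non-empty")
    # dynamic-programming edit distance over word tokens
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost) # substitution
    return d[len(ref)][len(hyp)] / len(ref)
```

Libraries such as `jiwer` provide an optimized equivalent, including the text normalization typically applied before scoring.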
---
pretty_name: Evaluation run of wandb/mistral-7b-zephyr-sft
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wandb/mistral-7b-zephyr-sft](https://huggingface.co/wandb/mistral-7b-zephyr-sft)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wandb__mistral-7b-zephyr-sft\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T19:11:05.830446](https://huggingface.co/datasets/open-llm-leaderboard/details_wandb__mistral-7b-zephyr-sft/blob/main/results_2024-03-11T19-11-05.830446.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6220033551825916,\n\
\ \"acc_stderr\": 0.03273913649788547,\n \"acc_norm\": 0.6267034317926227,\n\
\ \"acc_norm_stderr\": 0.033396874508558105,\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.53067255609753,\n\
\ \"mc2_stderr\": 0.0154935431075672\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5827645051194539,\n \"acc_stderr\": 0.014409825518403077,\n\
\ \"acc_norm\": 0.6228668941979523,\n \"acc_norm_stderr\": 0.014163366896192603\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6535550687114121,\n\
\ \"acc_stderr\": 0.0047486451332815725,\n \"acc_norm\": 0.8488348934475204,\n\
\ \"acc_norm_stderr\": 0.0035747765941085046\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316092,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316092\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.028637235639800893,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.028637235639800893\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036845,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036845\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5446808510638298,\n \"acc_stderr\": 0.03255525359340355,\n\
\ \"acc_norm\": 0.5446808510638298,\n \"acc_norm_stderr\": 0.03255525359340355\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086923992,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086923992\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768176,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768176\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7483870967741936,\n\
\ \"acc_stderr\": 0.024685979286239963,\n \"acc_norm\": 0.7483870967741936,\n\
\ \"acc_norm_stderr\": 0.024685979286239963\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.049236596391733084,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.049236596391733084\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7676767676767676,\n \"acc_stderr\": 0.03008862949021749,\n \"\
acc_norm\": 0.7676767676767676,\n \"acc_norm_stderr\": 0.03008862949021749\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6,\n \"acc_stderr\": 0.02483881198803316,\n \"acc_norm\"\
: 0.6,\n \"acc_norm_stderr\": 0.02483881198803316\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n\
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.01678548115920363,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.01678548115920363\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7696078431372549,\n \"acc_stderr\": 0.029554292605695066,\n \"\
acc_norm\": 0.7696078431372549,\n \"acc_norm_stderr\": 0.029554292605695066\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8058252427184466,\n \"acc_stderr\": 0.03916667762822585,\n\
\ \"acc_norm\": 0.8058252427184466,\n \"acc_norm_stderr\": 0.03916667762822585\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7828863346104725,\n\
\ \"acc_stderr\": 0.014743125394823298,\n \"acc_norm\": 0.7828863346104725,\n\
\ \"acc_norm_stderr\": 0.014743125394823298\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624734,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624734\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.01611523550486548,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.01611523550486548\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7124183006535948,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.7124183006535948,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799215,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799215\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4452411994784876,\n\
\ \"acc_stderr\": 0.012693421303973294,\n \"acc_norm\": 0.4452411994784876,\n\
\ \"acc_norm_stderr\": 0.012693421303973294\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330433,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330433\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.034873508801977704,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.034873508801977704\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7777777777777778,\n \"acc_stderr\": 0.031885780176863984,\n\
\ \"acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.031885780176863984\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3671970624235006,\n\
\ \"mc1_stderr\": 0.01687480500145318,\n \"mc2\": 0.53067255609753,\n\
\ \"mc2_stderr\": 0.0154935431075672\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089693\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.422289613343442,\n \
\ \"acc_stderr\": 0.013605126449611874\n }\n}\n```"
repo_url: https://huggingface.co/wandb/mistral-7b-zephyr-sft
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-23-02.306339.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-11-05.830446.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T19-11-05.830446.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- '**/details_harness|winogrande|5_2024-03-09T23-23-02.306339.parquet'
- split: 2024_03_11T19_11_05.830446
path:
- '**/details_harness|winogrande|5_2024-03-11T19-11-05.830446.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T19-11-05.830446.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_23_02.306339
path:
- results_2024-03-09T23-23-02.306339.parquet
- split: 2024_03_11T19_11_05.830446
path:
- results_2024-03-11T19-11-05.830446.parquet
- split: latest
path:
- results_2024-03-11T19-11-05.830446.parquet
---
# Dataset Card for Evaluation run of wandb/mistral-7b-zephyr-sft
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wandb/mistral-7b-zephyr-sft](https://huggingface.co/wandb/mistral-7b-zephyr-sft) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wandb__mistral-7b-zephyr-sft",
"harness_winogrande_5",
split="train")
```
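Individual timestamped runs can be loaded the same way by passing the run's split name, which is the timestamp with its `-` and `:` separators replaced by `_` (e.g. `2024_03_11T19_11_05.830446` in the config list above). A minimal sketch of that mapping (the helper name is illustrative, not part of the dataset tooling):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a run timestamp such as '2024-03-11T19:11:05.830446' to the
    corresponding split name ('2024_03_11T19_11_05.830446')."""
    return ts.replace("-", "_").replace(":", "_")

split_name = run_timestamp_to_split("2024-03-11T19:11:05.830446")
print(split_name)  # 2024_03_11T19_11_05.830446
# This split name can then be passed as `split=` to load_dataset above.
```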
## Latest results
These are the [latest results from run 2024-03-11T19:11:05.830446](https://huggingface.co/datasets/open-llm-leaderboard/details_wandb__mistral-7b-zephyr-sft/blob/main/results_2024-03-11T19-11-05.830446.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6220033551825916,
"acc_stderr": 0.03273913649788547,
"acc_norm": 0.6267034317926227,
"acc_norm_stderr": 0.033396874508558105,
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.53067255609753,
"mc2_stderr": 0.0154935431075672
},
"harness|arc:challenge|25": {
"acc": 0.5827645051194539,
"acc_stderr": 0.014409825518403077,
"acc_norm": 0.6228668941979523,
"acc_norm_stderr": 0.014163366896192603
},
"harness|hellaswag|10": {
"acc": 0.6535550687114121,
"acc_stderr": 0.0047486451332815725,
"acc_norm": 0.8488348934475204,
"acc_norm_stderr": 0.0035747765941085046
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316092,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316092
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.028637235639800893,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.028637235639800893
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036845,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036845
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5446808510638298,
"acc_stderr": 0.03255525359340355,
"acc_norm": 0.5446808510638298,
"acc_norm_stderr": 0.03255525359340355
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086923992,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086923992
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768176,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768176
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7483870967741936,
"acc_stderr": 0.024685979286239963,
"acc_norm": 0.7483870967741936,
"acc_norm_stderr": 0.024685979286239963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.6,
"acc_stderr": 0.049236596391733084,
"acc_norm": 0.6,
"acc_norm_stderr": 0.049236596391733084
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7676767676767676,
"acc_stderr": 0.03008862949021749,
"acc_norm": 0.7676767676767676,
"acc_norm_stderr": 0.03008862949021749
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6,
"acc_stderr": 0.02483881198803316,
"acc_norm": 0.6,
"acc_norm_stderr": 0.02483881198803316
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.01678548115920363,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.01678548115920363
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7696078431372549,
"acc_stderr": 0.029554292605695066,
"acc_norm": 0.7696078431372549,
"acc_norm_stderr": 0.029554292605695066
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8058252427184466,
"acc_stderr": 0.03916667762822585,
"acc_norm": 0.8058252427184466,
"acc_norm_stderr": 0.03916667762822585
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7828863346104725,
"acc_stderr": 0.014743125394823298,
"acc_norm": 0.7828863346104725,
"acc_norm_stderr": 0.014743125394823298
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624734,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624734
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.01611523550486548,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.01611523550486548
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7124183006535948,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.7124183006535948,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799215,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799215
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4452411994784876,
"acc_stderr": 0.012693421303973294,
"acc_norm": 0.4452411994784876,
"acc_norm_stderr": 0.012693421303973294
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330433,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330433
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.034873508801977704,
"acc_norm": 0.86,
"acc_norm_stderr": 0.034873508801977704
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.031885780176863984,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.031885780176863984
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3671970624235006,
"mc1_stderr": 0.01687480500145318,
"mc2": 0.53067255609753,
"mc2_stderr": 0.0154935431075672
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089693
},
"harness|gsm8k|5": {
"acc": 0.422289613343442,
"acc_stderr": 0.013605126449611874
}
}
```
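Aggregates such as an MMLU macro-average can be recomputed from the per-subtask entries in a results dict shaped like the one above. A minimal sketch (the three-entry dict is an illustrative excerpt, not the full 57-subtask results):

```python
# Recompute an MMLU macro-average from a harness results dict.
# The dict below is an illustrative excerpt of the full results above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6074074074074074},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6578947368421053},
}

mmlu_accs = [
    v["acc"] for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(f"macro-average over {len(mmlu_accs)} subtasks: {mmlu_macro_avg:.4f}")
```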
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]
---
pretty_name: Evaluation run of InnerI/InnerI-AI-sn6-7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [InnerI/InnerI-AI-sn6-7B-slerp](https://huggingface.co/InnerI/InnerI-AI-sn6-7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_InnerI__InnerI-AI-sn6-7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-09T23:27:33.296041](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerI-AI-sn6-7B-slerp/blob/main/results_2024-03-09T23-27-33.296041.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5863710004612449,\n\
\ \"acc_stderr\": 0.03343473736042695,\n \"acc_norm\": 0.5912816909298625,\n\
\ \"acc_norm_stderr\": 0.034112482625008565,\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5470215227427601,\n\
\ \"mc2_stderr\": 0.015011831793917758\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5298634812286689,\n \"acc_stderr\": 0.014585305840007105,\n\
\ \"acc_norm\": 0.5836177474402731,\n \"acc_norm_stderr\": 0.014405618279436174\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5883290181238797,\n\
\ \"acc_stderr\": 0.0049113035697697935,\n \"acc_norm\": 0.7758414658434575,\n\
\ \"acc_norm_stderr\": 0.004161746750401134\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526066,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526066\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5481481481481482,\n\
\ \"acc_stderr\": 0.04299268905480864,\n \"acc_norm\": 0.5481481481481482,\n\
\ \"acc_norm_stderr\": 0.04299268905480864\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880267,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880267\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6319444444444444,\n\
\ \"acc_stderr\": 0.04032999053960719,\n \"acc_norm\": 0.6319444444444444,\n\
\ \"acc_norm_stderr\": 0.04032999053960719\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.046550104113196177,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.046550104113196177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.41228070175438597,\n\
\ \"acc_stderr\": 0.046306532033665956,\n \"acc_norm\": 0.41228070175438597,\n\
\ \"acc_norm_stderr\": 0.046306532033665956\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.36507936507936506,\n\
\ \"acc_stderr\": 0.04306241259127153,\n \"acc_norm\": 0.36507936507936506,\n\
\ \"acc_norm_stderr\": 0.04306241259127153\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7032258064516129,\n\
\ \"acc_stderr\": 0.025988500792411898,\n \"acc_norm\": 0.7032258064516129,\n\
\ \"acc_norm_stderr\": 0.025988500792411898\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.035025446508458714,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.035025446508458714\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7272727272727273,\n \"acc_stderr\": 0.0347769116216366,\n\
\ \"acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.0347769116216366\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7424242424242424,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.7424242424242424,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8031088082901554,\n \"acc_stderr\": 0.02869787397186068,\n\
\ \"acc_norm\": 0.8031088082901554,\n \"acc_norm_stderr\": 0.02869787397186068\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5692307692307692,\n \"acc_stderr\": 0.025106820660539753,\n\
\ \"acc_norm\": 0.5692307692307692,\n \"acc_norm_stderr\": 0.025106820660539753\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.028578348365473082,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.028578348365473082\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5546218487394958,\n \"acc_stderr\": 0.0322841062671639,\n \
\ \"acc_norm\": 0.5546218487394958,\n \"acc_norm_stderr\": 0.0322841062671639\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.40397350993377484,\n \"acc_stderr\": 0.040064856853653415,\n \"\
acc_norm\": 0.40397350993377484,\n \"acc_norm_stderr\": 0.040064856853653415\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7614678899082569,\n \"acc_stderr\": 0.018272575810231874,\n \"\
acc_norm\": 0.7614678899082569,\n \"acc_norm_stderr\": 0.018272575810231874\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.02977177522814563,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.02977177522814563\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.02798569938703642,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.02798569938703642\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6412556053811659,\n\
\ \"acc_stderr\": 0.03219079200419996,\n \"acc_norm\": 0.6412556053811659,\n\
\ \"acc_norm_stderr\": 0.03219079200419996\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7480916030534351,\n \"acc_stderr\": 0.03807387116306085,\n\
\ \"acc_norm\": 0.7480916030534351,\n \"acc_norm_stderr\": 0.03807387116306085\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7239263803680982,\n \"acc_stderr\": 0.035123852837050475,\n\
\ \"acc_norm\": 0.7239263803680982,\n \"acc_norm_stderr\": 0.035123852837050475\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.04793724854411022,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.04793724854411022\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7662835249042146,\n\
\ \"acc_stderr\": 0.015133383278988832,\n \"acc_norm\": 0.7662835249042146,\n\
\ \"acc_norm_stderr\": 0.015133383278988832\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6445086705202312,\n \"acc_stderr\": 0.025770292082977247,\n\
\ \"acc_norm\": 0.6445086705202312,\n \"acc_norm_stderr\": 0.025770292082977247\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3016759776536313,\n\
\ \"acc_stderr\": 0.015350767572220286,\n \"acc_norm\": 0.3016759776536313,\n\
\ \"acc_norm_stderr\": 0.015350767572220286\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6111111111111112,\n \"acc_stderr\": 0.027914055510468008,\n\
\ \"acc_norm\": 0.6111111111111112,\n \"acc_norm_stderr\": 0.027914055510468008\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.02666441088693762,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.02666441088693762\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6388888888888888,\n \"acc_stderr\": 0.02672586880910079,\n\
\ \"acc_norm\": 0.6388888888888888,\n \"acc_norm_stderr\": 0.02672586880910079\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.3900709219858156,\n \"acc_stderr\": 0.029097675599463926,\n \
\ \"acc_norm\": 0.3900709219858156,\n \"acc_norm_stderr\": 0.029097675599463926\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4061277705345502,\n\
\ \"acc_stderr\": 0.012543154588412935,\n \"acc_norm\": 0.4061277705345502,\n\
\ \"acc_norm_stderr\": 0.012543154588412935\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5330882352941176,\n \"acc_stderr\": 0.030306257722468317,\n\
\ \"acc_norm\": 0.5330882352941176,\n \"acc_norm_stderr\": 0.030306257722468317\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5882352941176471,\n \"acc_stderr\": 0.019910377463105935,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.019910377463105935\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6454545454545455,\n\
\ \"acc_stderr\": 0.045820048415054174,\n \"acc_norm\": 0.6454545454545455,\n\
\ \"acc_norm_stderr\": 0.045820048415054174\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.031362502409358936,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.031362502409358936\n \
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017197,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017197\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.46987951807228917,\n\
\ \"acc_stderr\": 0.03885425420866766,\n \"acc_norm\": 0.46987951807228917,\n\
\ \"acc_norm_stderr\": 0.03885425420866766\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.386780905752754,\n\
\ \"mc1_stderr\": 0.01704885701051511,\n \"mc2\": 0.5470215227427601,\n\
\ \"mc2_stderr\": 0.015011831793917758\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7292817679558011,\n \"acc_stderr\": 0.012487904760626304\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.3949962092494314,\n \
\ \"acc_stderr\": 0.013465354969973208\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-27-33.296041.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-09T23-27-33.296041.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- '**/details_harness|winogrande|5_2024-03-09T23-27-33.296041.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-09T23-27-33.296041.parquet'
- config_name: results
data_files:
- split: 2024_03_09T23_27_33.296041
path:
- results_2024-03-09T23-27-33.296041.parquet
- split: latest
path:
- results_2024-03-09T23-27-33.296041.parquet
---
# Dataset Card for Evaluation run of InnerI/InnerI-AI-sn6-7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [InnerI/InnerI-AI-sn6-7B-slerp](https://huggingface.co/InnerI/InnerI-AI-sn6-7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_InnerI__InnerI-AI-sn6-7B-slerp",
"harness_winogrande_5",
split="train")
```
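The timestamped split names (e.g. `2024_03_09T23_27_33.296041`) sort chronologically, since the format is fixed-width and year-first; the `latest` split is simply an alias for the most recent run. A minimal sketch of picking the newest run from a list of split names (the names below are illustrative):

```python
# Timestamped split names sort chronologically because the format is
# fixed-width and year-first, so "latest" can be recovered with max().
splits = [
    "2024_03_08T10_00_00.000000",
    "2024_03_09T23_27_33.296041",
    "latest",
]
latest = max(s for s in splits if s != "latest")
print(latest)  # the most recent timestamped run
```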
## Latest results
These are the [latest results from run 2024-03-09T23:27:33.296041](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__InnerI-AI-sn6-7B-slerp/blob/main/results_2024-03-09T23-27-33.296041.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5863710004612449,
"acc_stderr": 0.03343473736042695,
"acc_norm": 0.5912816909298625,
"acc_norm_stderr": 0.034112482625008565,
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5470215227427601,
"mc2_stderr": 0.015011831793917758
},
"harness|arc:challenge|25": {
"acc": 0.5298634812286689,
"acc_stderr": 0.014585305840007105,
"acc_norm": 0.5836177474402731,
"acc_norm_stderr": 0.014405618279436174
},
"harness|hellaswag|10": {
"acc": 0.5883290181238797,
"acc_stderr": 0.0049113035697697935,
"acc_norm": 0.7758414658434575,
"acc_norm_stderr": 0.004161746750401134
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526066,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526066
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5481481481481482,
"acc_stderr": 0.04299268905480864,
"acc_norm": 0.5481481481481482,
"acc_norm_stderr": 0.04299268905480864
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880267,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880267
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6319444444444444,
"acc_stderr": 0.04032999053960719,
"acc_norm": 0.6319444444444444,
"acc_norm_stderr": 0.04032999053960719
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.046550104113196177,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.046550104113196177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.41228070175438597,
"acc_stderr": 0.046306532033665956,
"acc_norm": 0.41228070175438597,
"acc_norm_stderr": 0.046306532033665956
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.36507936507936506,
"acc_stderr": 0.04306241259127153,
"acc_norm": 0.36507936507936506,
"acc_norm_stderr": 0.04306241259127153
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7032258064516129,
"acc_stderr": 0.025988500792411898,
"acc_norm": 0.7032258064516129,
"acc_norm_stderr": 0.025988500792411898
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.035025446508458714,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.035025446508458714
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.0347769116216366,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.0347769116216366
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7424242424242424,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.7424242424242424,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8031088082901554,
"acc_stderr": 0.02869787397186068,
"acc_norm": 0.8031088082901554,
"acc_norm_stderr": 0.02869787397186068
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5692307692307692,
"acc_stderr": 0.025106820660539753,
"acc_norm": 0.5692307692307692,
"acc_norm_stderr": 0.025106820660539753
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.028578348365473082,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.028578348365473082
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5546218487394958,
"acc_stderr": 0.0322841062671639,
"acc_norm": 0.5546218487394958,
"acc_norm_stderr": 0.0322841062671639
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.40397350993377484,
"acc_stderr": 0.040064856853653415,
"acc_norm": 0.40397350993377484,
"acc_norm_stderr": 0.040064856853653415
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7614678899082569,
"acc_stderr": 0.018272575810231874,
"acc_norm": 0.7614678899082569,
"acc_norm_stderr": 0.018272575810231874
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.02977177522814563,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.02977177522814563
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.02798569938703642,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.02798569938703642
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6412556053811659,
"acc_stderr": 0.03219079200419996,
"acc_norm": 0.6412556053811659,
"acc_norm_stderr": 0.03219079200419996
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7480916030534351,
"acc_stderr": 0.03807387116306085,
"acc_norm": 0.7480916030534351,
"acc_norm_stderr": 0.03807387116306085
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7239263803680982,
"acc_stderr": 0.035123852837050475,
"acc_norm": 0.7239263803680982,
"acc_norm_stderr": 0.035123852837050475
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.04793724854411022,
"acc_norm": 0.65,
"acc_norm_stderr": 0.04793724854411022
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7662835249042146,
"acc_stderr": 0.015133383278988832,
"acc_norm": 0.7662835249042146,
"acc_norm_stderr": 0.015133383278988832
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6445086705202312,
"acc_stderr": 0.025770292082977247,
"acc_norm": 0.6445086705202312,
"acc_norm_stderr": 0.025770292082977247
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3016759776536313,
"acc_stderr": 0.015350767572220286,
"acc_norm": 0.3016759776536313,
"acc_norm_stderr": 0.015350767572220286
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6111111111111112,
"acc_stderr": 0.027914055510468008,
"acc_norm": 0.6111111111111112,
"acc_norm_stderr": 0.027914055510468008
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.02666441088693762,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.02666441088693762
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6388888888888888,
"acc_stderr": 0.02672586880910079,
"acc_norm": 0.6388888888888888,
"acc_norm_stderr": 0.02672586880910079
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.3900709219858156,
"acc_stderr": 0.029097675599463926,
"acc_norm": 0.3900709219858156,
"acc_norm_stderr": 0.029097675599463926
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4061277705345502,
"acc_stderr": 0.012543154588412935,
"acc_norm": 0.4061277705345502,
"acc_norm_stderr": 0.012543154588412935
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5330882352941176,
"acc_stderr": 0.030306257722468317,
"acc_norm": 0.5330882352941176,
"acc_norm_stderr": 0.030306257722468317
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.019910377463105935,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.019910377463105935
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6454545454545455,
"acc_stderr": 0.045820048415054174,
"acc_norm": 0.6454545454545455,
"acc_norm_stderr": 0.045820048415054174
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6,
"acc_stderr": 0.031362502409358936,
"acc_norm": 0.6,
"acc_norm_stderr": 0.031362502409358936
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017197,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017197
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.46987951807228917,
"acc_stderr": 0.03885425420866766,
"acc_norm": 0.46987951807228917,
"acc_norm_stderr": 0.03885425420866766
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.386780905752754,
"mc1_stderr": 0.01704885701051511,
"mc2": 0.5470215227427601,
"mc2_stderr": 0.015011831793917758
},
"harness|winogrande|5": {
"acc": 0.7292817679558011,
"acc_stderr": 0.012487904760626304
},
"harness|gsm8k|5": {
"acc": 0.3949962092494314,
"acc_stderr": 0.013465354969973208
}
}
```
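The top-level `all` block is presumably an unweighted mean over the per-task scores, so it can be sanity-checked locally. A rough sketch using a hand-copied subset of the MMLU `acc` values above (a subset only, so the mean will not match the `all` figure exactly):

```python
# Hand-copied "acc" values for a few hendrycksTest tasks from the JSON above.
task_acc = {
    "abstract_algebra": 0.33,
    "anatomy": 0.5481481481481482,
    "astronomy": 0.618421052631579,
    "business_ethics": 0.55,
}

# Unweighted macro-average over the selected tasks.
macro_acc = sum(task_acc.values()) / len(task_acc)
print(round(macro_acc, 4))
```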
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
iocuydi/amharic-dolly-15k | iocuydi | "2024-03-29T03:55:00Z" | 0 | 0 | [
"license:cc-by-sa-3.0",
"size_categories:10K<n<100K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"arxiv:2403.06354",
"region:us"
] | null | "2024-03-09T23:46:21Z" | ---
license: cc-by-sa-3.0
---
Amharic version of the Dolly dataset (https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm)
Translated with this script: https://github.com/iocuydi/amharic-llama-llava/blob/main/data/prepare_amharic_data.py
More details: https://arxiv.org/abs/2403.06354 |
sauravjoshi23/2wikimultihopqa_mistral | sauravjoshi23 | "2024-03-09T23:55:11Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-09T23:47:57Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 617405907
num_examples: 167454
download_size: 319774951
dataset_size: 617405907
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
sproos/cosmopedia-100k-v0-activations | sproos | "2024-03-09T23:50:36Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-09T23:50:23Z" | ---
dataset_info:
features:
- name: text
dtype: string
- name: embedding
sequence: float64
- name: activations
sequence: float64
splits:
- name: train
num_bytes: 72548823.52993
num_examples: 2993
download_size: 16765503
dataset_size: 72548823.52993
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iocuydi/amharic-OASST1-pruned | iocuydi | "2024-03-29T03:54:47Z" | 0 | 0 | [
"license:apache-2.0",
"arxiv:2403.06354",
"region:us"
] | null | "2024-03-09T23:54:15Z" | ---
license: apache-2.0
---
Amharic version of a pruned set of the OpenAssistant dataset (https://huggingface.co/datasets/OpenAssistant/oasst1).
The dataset was pruned to include only top rated conversation trees.
The English pruned version is included alongside the Amharic translated dataset.
Translation was done with Seamless M4T using the methods here: https://github.com/iocuydi/amharic-llama-llava/tree/main/translation
More details: https://arxiv.org/abs/2403.06354 |
sauravjoshi23/musique_mistral | sauravjoshi23 | "2024-03-10T00:22:24Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T00:21:43Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 206364842
num_examples: 19938
download_size: 89484342
dataset_size: 206364842
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Ultimatech/InteriorDesign | Ultimatech | "2024-03-10T01:55:26Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-03-10T00:27:37Z" | ---
license: apache-2.0
---
|
hayden-donnelly/colored-primitives | hayden-donnelly | "2024-04-08T18:32:06Z" | 0 | 1 | [
"task_categories:unconditional-image-generation",
"license:cc0-1.0",
"size_categories:100K<n<1M",
"format:parquet",
"modality:image",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"unconditional-image-generation"
] | "2024-03-10T00:28:58Z" | ---
license: cc0-1.0
task_categories:
- unconditional-image-generation
pretty_name: Colored Primitives
size_categories:
- 100K<n<1M
---
# Colored Primitives
A toy dataset for unconditional image generation. It consists of 152k renders of 3D primitives at a resolution of 128x128 pixels.
![image/png](https://cdn-uploads.huggingface.co/production/uploads/643ae6350e5495afdefb26e1/JjCUaDFlbT9WUss6VJCoY.png)
## Metadata
- Image resolution: 128x128
- Image encoding: PNG
- Image count: 152,000 |
open-llm-leaderboard-old/details_crumb__nano-mistral | open-llm-leaderboard-old | "2024-03-10T00:30:03Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T00:29:42Z" | ---
pretty_name: Evaluation run of crumb/nano-mistral
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [crumb/nano-mistral](https://huggingface.co/crumb/nano-mistral) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_crumb__nano-mistral\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:27:55.291023](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__nano-mistral/blob/main/results_2024-03-10T00-27-55.291023.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2511203777273379,\n\
\ \"acc_stderr\": 0.03056453888033514,\n \"acc_norm\": 0.2515472364859563,\n\
\ \"acc_norm_stderr\": 0.03137819323372588,\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.4741805949532385,\n\
\ \"mc2_stderr\": 0.015622084311020428\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.17747440273037543,\n \"acc_stderr\": 0.01116513876964396,\n\
\ \"acc_norm\": 0.2167235494880546,\n \"acc_norm_stderr\": 0.01204015671348119\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27743477394941246,\n\
\ \"acc_stderr\": 0.004468178273665653,\n \"acc_norm\": 0.28520215096594304,\n\
\ \"acc_norm_stderr\": 0.004505879084606852\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2518518518518518,\n\
\ \"acc_stderr\": 0.03749850709174022,\n \"acc_norm\": 0.2518518518518518,\n\
\ \"acc_norm_stderr\": 0.03749850709174022\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.02605529690115292,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.02605529690115292\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.04093601807403326\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198813,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198813\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.04336432707993177,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.04336432707993177\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.2,\n \"acc_stderr\": 0.04020151261036843,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.04020151261036843\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.26382978723404255,\n \"acc_stderr\": 0.02880998985410297,\n\
\ \"acc_norm\": 0.26382978723404255,\n \"acc_norm_stderr\": 0.02880998985410297\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.040493392977481404,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.040493392977481404\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.27586206896551724,\n \"acc_stderr\": 0.037245636197746325,\n\
\ \"acc_norm\": 0.27586206896551724,\n \"acc_norm_stderr\": 0.037245636197746325\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.18253968253968253,\n\
\ \"acc_stderr\": 0.03455071019102146,\n \"acc_norm\": 0.18253968253968253,\n\
\ \"acc_norm_stderr\": 0.03455071019102146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.15,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.15,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3032258064516129,\n \"acc_stderr\": 0.02614868593067175,\n \"\
acc_norm\": 0.3032258064516129,\n \"acc_norm_stderr\": 0.02614868593067175\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.270935960591133,\n \"acc_stderr\": 0.031270907132976984,\n \"\
acc_norm\": 0.270935960591133,\n \"acc_norm_stderr\": 0.031270907132976984\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.22424242424242424,\n \"acc_stderr\": 0.032568666616811015,\n\
\ \"acc_norm\": 0.22424242424242424,\n \"acc_norm_stderr\": 0.032568666616811015\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178253,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178253\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073828,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073828\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23949579831932774,\n \"acc_stderr\": 0.027722065493361273,\n\
\ \"acc_norm\": 0.23949579831932774,\n \"acc_norm_stderr\": 0.027722065493361273\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.2,\n \"acc_stderr\": 0.01714985851425095,\n \"acc_norm\": 0.2,\n\
\ \"acc_norm_stderr\": 0.01714985851425095\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.4722222222222222,\n \"acc_stderr\": 0.0340470532865388,\n\
\ \"acc_norm\": 0.4722222222222222,\n \"acc_norm_stderr\": 0.0340470532865388\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24509803921568626,\n \"acc_stderr\": 0.030190282453501954,\n \"\
acc_norm\": 0.24509803921568626,\n \"acc_norm_stderr\": 0.030190282453501954\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.26582278481012656,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.26582278481012656,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.30493273542600896,\n\
\ \"acc_stderr\": 0.030898610882477518,\n \"acc_norm\": 0.30493273542600896,\n\
\ \"acc_norm_stderr\": 0.030898610882477518\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.24427480916030533,\n \"acc_stderr\": 0.03768335959728742,\n\
\ \"acc_norm\": 0.24427480916030533,\n \"acc_norm_stderr\": 0.03768335959728742\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.3140495867768595,\n \"acc_stderr\": 0.042369647530410184,\n \"\
acc_norm\": 0.3140495867768595,\n \"acc_norm_stderr\": 0.042369647530410184\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252628,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252628\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24358974358974358,\n\
\ \"acc_stderr\": 0.028120966503914387,\n \"acc_norm\": 0.24358974358974358,\n\
\ \"acc_norm_stderr\": 0.028120966503914387\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.24265644955300128,\n\
\ \"acc_stderr\": 0.01532988894089986,\n \"acc_norm\": 0.24265644955300128,\n\
\ \"acc_norm_stderr\": 0.01532988894089986\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.21241830065359477,\n \"acc_stderr\": 0.02342037547829613,\n\
\ \"acc_norm\": 0.21241830065359477,\n \"acc_norm_stderr\": 0.02342037547829613\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.20257234726688103,\n\
\ \"acc_stderr\": 0.02282731749105969,\n \"acc_norm\": 0.20257234726688103,\n\
\ \"acc_norm_stderr\": 0.02282731749105969\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2037037037037037,\n \"acc_stderr\": 0.02240967454730418,\n\
\ \"acc_norm\": 0.2037037037037037,\n \"acc_norm_stderr\": 0.02240967454730418\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2127659574468085,\n \"acc_stderr\": 0.024414612974307703,\n \
\ \"acc_norm\": 0.2127659574468085,\n \"acc_norm_stderr\": 0.024414612974307703\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24641460234680573,\n\
\ \"acc_stderr\": 0.011005971399927235,\n \"acc_norm\": 0.24641460234680573,\n\
\ \"acc_norm_stderr\": 0.011005971399927235\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.45955882352941174,\n \"acc_stderr\": 0.03027332507734575,\n\
\ \"acc_norm\": 0.45955882352941174,\n \"acc_norm_stderr\": 0.03027332507734575\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.22727272727272727,\n\
\ \"acc_stderr\": 0.040139645540727756,\n \"acc_norm\": 0.22727272727272727,\n\
\ \"acc_norm_stderr\": 0.040139645540727756\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.22857142857142856,\n \"acc_stderr\": 0.026882144922307744,\n\
\ \"acc_norm\": 0.22857142857142856,\n \"acc_norm_stderr\": 0.026882144922307744\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409214,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409214\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2710843373493976,\n\
\ \"acc_stderr\": 0.03460579907553027,\n \"acc_norm\": 0.2710843373493976,\n\
\ \"acc_norm_stderr\": 0.03460579907553027\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2222222222222222,\n \"acc_stderr\": 0.03188578017686399,\n\
\ \"acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.03188578017686399\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2607099143206854,\n\
\ \"mc1_stderr\": 0.015368841620766368,\n \"mc2\": 0.4741805949532385,\n\
\ \"mc2_stderr\": 0.015622084311020428\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5240726124704025,\n \"acc_stderr\": 0.014036189665395129\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-27-55.291023.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-27-55.291023.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- '**/details_harness|winogrande|5_2024-03-10T00-27-55.291023.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-27-55.291023.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_27_55.291023
path:
- results_2024-03-10T00-27-55.291023.parquet
- split: latest
path:
- results_2024-03-10T00-27-55.291023.parquet
---
# Dataset Card for Evaluation run of crumb/nano-mistral
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [crumb/nano-mistral](https://huggingface.co/crumb/nano-mistral) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_crumb__nano-mistral",
"harness_winogrande_5",
split="train")
```
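As a side note on naming: the timestamped split names in the configs above (e.g. `2024_03_10T00_27_55.291023`) appear to be derived from the run timestamp used in the parquet file names (`2024-03-10T00-27-55.291023`) by swapping hyphens for underscores. A minimal sketch of that mapping, inferred from the listing above (the helper name is our own):

```python
def run_timestamp_to_split(ts: str) -> str:
    """Map a file-name timestamp such as '2024-03-10T00-27-55.291023'
    to the split-name form '2024_03_10T00_27_55.291023'."""
    # The date and time separators both become underscores; the 'T'
    # and the fractional-seconds dot are left untouched.
    return ts.replace("-", "_")

print(run_timestamp_to_split("2024-03-10T00-27-55.291023"))
# → 2024_03_10T00_27_55.291023
```

This is only a convenience for mapping between the two naming schemes; the `latest` split is usually the simpler way to fetch the most recent run.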
## Latest results
These are the [latest results from run 2024-03-10T00:27:55.291023](https://huggingface.co/datasets/open-llm-leaderboard/details_crumb__nano-mistral/blob/main/results_2024-03-10T00-27-55.291023.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each eval's results in its "latest" split):
```python
{
"all": {
"acc": 0.2511203777273379,
"acc_stderr": 0.03056453888033514,
"acc_norm": 0.2515472364859563,
"acc_norm_stderr": 0.03137819323372588,
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.4741805949532385,
"mc2_stderr": 0.015622084311020428
},
"harness|arc:challenge|25": {
"acc": 0.17747440273037543,
"acc_stderr": 0.01116513876964396,
"acc_norm": 0.2167235494880546,
"acc_norm_stderr": 0.01204015671348119
},
"harness|hellaswag|10": {
"acc": 0.27743477394941246,
"acc_stderr": 0.004468178273665653,
"acc_norm": 0.28520215096594304,
"acc_norm_stderr": 0.004505879084606852
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2518518518518518,
"acc_stderr": 0.03749850709174022,
"acc_norm": 0.2518518518518518,
"acc_norm_stderr": 0.03749850709174022
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.02605529690115292,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.02605529690115292
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198813,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198813
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.04336432707993177,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.04336432707993177
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036843,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036843
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.26382978723404255,
"acc_stderr": 0.02880998985410297,
"acc_norm": 0.26382978723404255,
"acc_norm_stderr": 0.02880998985410297
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.040493392977481404,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.040493392977481404
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.27586206896551724,
"acc_stderr": 0.037245636197746325,
"acc_norm": 0.27586206896551724,
"acc_norm_stderr": 0.037245636197746325
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.18253968253968253,
"acc_stderr": 0.03455071019102146,
"acc_norm": 0.18253968253968253,
"acc_norm_stderr": 0.03455071019102146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.15,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.15,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3032258064516129,
"acc_stderr": 0.02614868593067175,
"acc_norm": 0.3032258064516129,
"acc_norm_stderr": 0.02614868593067175
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.270935960591133,
"acc_stderr": 0.031270907132976984,
"acc_norm": 0.270935960591133,
"acc_norm_stderr": 0.031270907132976984
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.22424242424242424,
"acc_stderr": 0.032568666616811015,
"acc_norm": 0.22424242424242424,
"acc_norm_stderr": 0.032568666616811015
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178253,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178253
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073828,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073828
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23949579831932774,
"acc_stderr": 0.027722065493361273,
"acc_norm": 0.23949579831932774,
"acc_norm_stderr": 0.027722065493361273
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.2,
"acc_stderr": 0.01714985851425095,
"acc_norm": 0.2,
"acc_norm_stderr": 0.01714985851425095
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4722222222222222,
"acc_stderr": 0.0340470532865388,
"acc_norm": 0.4722222222222222,
"acc_norm_stderr": 0.0340470532865388
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24509803921568626,
"acc_stderr": 0.030190282453501954,
"acc_norm": 0.24509803921568626,
"acc_norm_stderr": 0.030190282453501954
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.26582278481012656,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.26582278481012656,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.30493273542600896,
"acc_stderr": 0.030898610882477518,
"acc_norm": 0.30493273542600896,
"acc_norm_stderr": 0.030898610882477518
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.24427480916030533,
"acc_stderr": 0.03768335959728742,
"acc_norm": 0.24427480916030533,
"acc_norm_stderr": 0.03768335959728742
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.3140495867768595,
"acc_stderr": 0.042369647530410184,
"acc_norm": 0.3140495867768595,
"acc_norm_stderr": 0.042369647530410184
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252628,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252628
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24358974358974358,
"acc_stderr": 0.028120966503914387,
"acc_norm": 0.24358974358974358,
"acc_norm_stderr": 0.028120966503914387
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.24265644955300128,
"acc_stderr": 0.01532988894089986,
"acc_norm": 0.24265644955300128,
"acc_norm_stderr": 0.01532988894089986
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.21241830065359477,
"acc_stderr": 0.02342037547829613,
"acc_norm": 0.21241830065359477,
"acc_norm_stderr": 0.02342037547829613
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.20257234726688103,
"acc_stderr": 0.02282731749105969,
"acc_norm": 0.20257234726688103,
"acc_norm_stderr": 0.02282731749105969
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2037037037037037,
"acc_stderr": 0.02240967454730418,
"acc_norm": 0.2037037037037037,
"acc_norm_stderr": 0.02240967454730418
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2127659574468085,
"acc_stderr": 0.024414612974307703,
"acc_norm": 0.2127659574468085,
"acc_norm_stderr": 0.024414612974307703
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24641460234680573,
"acc_stderr": 0.011005971399927235,
"acc_norm": 0.24641460234680573,
"acc_norm_stderr": 0.011005971399927235
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.45955882352941174,
"acc_stderr": 0.03027332507734575,
"acc_norm": 0.45955882352941174,
"acc_norm_stderr": 0.03027332507734575
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.22727272727272727,
"acc_stderr": 0.040139645540727756,
"acc_norm": 0.22727272727272727,
"acc_norm_stderr": 0.040139645540727756
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.22857142857142856,
"acc_stderr": 0.026882144922307744,
"acc_norm": 0.22857142857142856,
"acc_norm_stderr": 0.026882144922307744
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409214,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409214
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2710843373493976,
"acc_stderr": 0.03460579907553027,
"acc_norm": 0.2710843373493976,
"acc_norm_stderr": 0.03460579907553027
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.03188578017686399,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.03188578017686399
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2607099143206854,
"mc1_stderr": 0.015368841620766368,
"mc2": 0.4741805949532385,
"mc2_stderr": 0.015622084311020428
},
"harness|winogrande|5": {
"acc": 0.5240726124704025,
"acc_stderr": 0.014036189665395129
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
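To work with a results payload like the one above programmatically, a small sketch (key names copied from the JSON; only a subset of tasks is shown, and the `task_score` helper is our own convention, preferring `acc_norm` where available):

```python
# A subset of the results payload above, keyed by harness task name.
results = {
    "all": {"acc": 0.2511203777273379, "mc2": 0.4741805949532385},
    "harness|arc:challenge|25": {"acc_norm": 0.2167235494880546},
    "harness|hellaswag|10": {"acc_norm": 0.28520215096594304},
    "harness|winogrande|5": {"acc": 0.5240726124704025},
    "harness|gsm8k|5": {"acc": 0.0},
}

def task_score(metrics: dict) -> float:
    """Accuracy-like score for one task, preferring acc_norm when present."""
    return metrics.get("acc_norm", metrics.get("acc", 0.0))

# Per-task scores, excluding the aggregated "all" entry.
per_task = {k: task_score(v) for k, v in results.items() if k != "all"}
best = max(per_task, key=per_task.get)
print(best, round(per_task[best], 3))
# → harness|winogrande|5 0.524
```

The same pattern applies to the full payload once loaded from the `results` config or the JSON file linked above.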
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
thejorseman/Albures | thejorseman | "2024-03-23T23:24:10Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T00:31:03Z" | ---
license: apache-2.0
---
|
iocuydi/amharic-visual-instruction-tuning | iocuydi | "2024-03-29T03:54:11Z" | 0 | 0 | [
"license:apache-2.0",
"arxiv:2403.06354",
"region:us"
] | null | "2024-03-10T00:45:42Z" | ---
license: apache-2.0
---
Dataset used for finetuning step of Amharic llava.
More details: https://arxiv.org/abs/2403.06354 |
open-llm-leaderboard-old/details_g-ronimo__phi-2-OpenHermes-2.5-v2 | open-llm-leaderboard-old | "2024-03-10T00:51:17Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T00:50:56Z" | ---
pretty_name: Evaluation run of g-ronimo/phi-2-OpenHermes-2.5-v2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [g-ronimo/phi-2-OpenHermes-2.5-v2](https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5-v2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:49:09.888984](https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2/blob/main/results_2024-03-10T00-49-09.888984.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.564734458241999,\n\
\ \"acc_stderr\": 0.03391431521091429,\n \"acc_norm\": 0.5676857564160381,\n\
\ \"acc_norm_stderr\": 0.03461774832252384,\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.44887128521126124,\n\
\ \"mc2_stderr\": 0.015342799330160783\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5699658703071673,\n \"acc_stderr\": 0.01446763155913799,\n\
\ \"acc_norm\": 0.5844709897610921,\n \"acc_norm_stderr\": 0.014401366641216388\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5592511451902011,\n\
\ \"acc_stderr\": 0.004954622308738996,\n \"acc_norm\": 0.7456681935869349,\n\
\ \"acc_norm_stderr\": 0.004345949382382379\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.48148148148148145,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.48148148148148145,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.0403356566784832,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.0403356566784832\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5735849056603773,\n \"acc_stderr\": 0.03043779434298305,\n\
\ \"acc_norm\": 0.5735849056603773,\n \"acc_norm_stderr\": 0.03043779434298305\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6527777777777778,\n\
\ \"acc_stderr\": 0.039812405437178615,\n \"acc_norm\": 0.6527777777777778,\n\
\ \"acc_norm_stderr\": 0.039812405437178615\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\"\
: 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.03794012674697031,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.03794012674697031\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006717,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006717\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542128\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.39473684210526316,\n\
\ \"acc_stderr\": 0.045981880578165414,\n \"acc_norm\": 0.39473684210526316,\n\
\ \"acc_norm_stderr\": 0.045981880578165414\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5103448275862069,\n \"acc_stderr\": 0.04165774775728763,\n\
\ \"acc_norm\": 0.5103448275862069,\n \"acc_norm_stderr\": 0.04165774775728763\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.40476190476190477,\n \"acc_stderr\": 0.025279850397404907,\n \"\
acc_norm\": 0.40476190476190477,\n \"acc_norm_stderr\": 0.025279850397404907\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6645161290322581,\n \"acc_stderr\": 0.026860206444724352,\n \"\
acc_norm\": 0.6645161290322581,\n \"acc_norm_stderr\": 0.026860206444724352\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n \"\
acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6606060606060606,\n \"acc_stderr\": 0.036974422050315967,\n\
\ \"acc_norm\": 0.6606060606060606,\n \"acc_norm_stderr\": 0.036974422050315967\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7373737373737373,\n \"acc_stderr\": 0.03135305009533086,\n \"\
acc_norm\": 0.7373737373737373,\n \"acc_norm_stderr\": 0.03135305009533086\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7564766839378239,\n \"acc_stderr\": 0.030975436386845436,\n\
\ \"acc_norm\": 0.7564766839378239,\n \"acc_norm_stderr\": 0.030975436386845436\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5615384615384615,\n \"acc_stderr\": 0.025158266016868578,\n\
\ \"acc_norm\": 0.5615384615384615,\n \"acc_norm_stderr\": 0.025158266016868578\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2962962962962963,\n \"acc_stderr\": 0.027840811495871916,\n \
\ \"acc_norm\": 0.2962962962962963,\n \"acc_norm_stderr\": 0.027840811495871916\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.03196876989195778,\n \
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.03196876989195778\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7834862385321101,\n \"acc_stderr\": 0.017658710594443128,\n \"\
acc_norm\": 0.7834862385321101,\n \"acc_norm_stderr\": 0.017658710594443128\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4537037037037037,\n \"acc_stderr\": 0.033953227263757976,\n \"\
acc_norm\": 0.4537037037037037,\n \"acc_norm_stderr\": 0.033953227263757976\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6617647058823529,\n \"acc_stderr\": 0.03320574612945431,\n \"\
acc_norm\": 0.6617647058823529,\n \"acc_norm_stderr\": 0.03320574612945431\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7341772151898734,\n \"acc_stderr\": 0.028756799629658342,\n \
\ \"acc_norm\": 0.7341772151898734,\n \"acc_norm_stderr\": 0.028756799629658342\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6502242152466368,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.6502242152466368,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6335877862595419,\n \"acc_stderr\": 0.04225875451969638,\n\
\ \"acc_norm\": 0.6335877862595419,\n \"acc_norm_stderr\": 0.04225875451969638\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.6990291262135923,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.6990291262135923,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8205128205128205,\n\
\ \"acc_stderr\": 0.025140935950335428,\n \"acc_norm\": 0.8205128205128205,\n\
\ \"acc_norm_stderr\": 0.025140935950335428\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6756066411238825,\n\
\ \"acc_stderr\": 0.016740929047162696,\n \"acc_norm\": 0.6756066411238825,\n\
\ \"acc_norm_stderr\": 0.016740929047162696\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.661849710982659,\n \"acc_stderr\": 0.02546977014940017,\n\
\ \"acc_norm\": 0.661849710982659,\n \"acc_norm_stderr\": 0.02546977014940017\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2212290502793296,\n\
\ \"acc_stderr\": 0.01388216459888727,\n \"acc_norm\": 0.2212290502793296,\n\
\ \"acc_norm_stderr\": 0.01388216459888727\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6013071895424836,\n \"acc_stderr\": 0.02803609227389177,\n\
\ \"acc_norm\": 0.6013071895424836,\n \"acc_norm_stderr\": 0.02803609227389177\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6237942122186495,\n\
\ \"acc_stderr\": 0.02751392568354943,\n \"acc_norm\": 0.6237942122186495,\n\
\ \"acc_norm_stderr\": 0.02751392568354943\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6203703703703703,\n \"acc_stderr\": 0.027002521034516468,\n\
\ \"acc_norm\": 0.6203703703703703,\n \"acc_norm_stderr\": 0.027002521034516468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4078014184397163,\n \"acc_stderr\": 0.02931601177634356,\n \
\ \"acc_norm\": 0.4078014184397163,\n \"acc_norm_stderr\": 0.02931601177634356\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.40808344198174706,\n\
\ \"acc_stderr\": 0.012552598958563662,\n \"acc_norm\": 0.40808344198174706,\n\
\ \"acc_norm_stderr\": 0.012552598958563662\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.44485294117647056,\n \"acc_stderr\": 0.03018753206032939,\n\
\ \"acc_norm\": 0.44485294117647056,\n \"acc_norm_stderr\": 0.03018753206032939\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5718954248366013,\n \"acc_stderr\": 0.0200176292142131,\n \
\ \"acc_norm\": 0.5718954248366013,\n \"acc_norm_stderr\": 0.0200176292142131\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6938775510204082,\n \"acc_stderr\": 0.02950489645459595,\n\
\ \"acc_norm\": 0.6938775510204082,\n \"acc_norm_stderr\": 0.02950489645459595\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357303,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357303\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5,\n \
\ \"acc_stderr\": 0.03892494720807614,\n \"acc_norm\": 0.5,\n \"\
acc_norm_stderr\": 0.03892494720807614\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.035650796707083106,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.035650796707083106\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3011015911872705,\n\
\ \"mc1_stderr\": 0.01605899902610061,\n \"mc2\": 0.44887128521126124,\n\
\ \"mc2_stderr\": 0.015342799330160783\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7521704814522494,\n \"acc_stderr\": 0.012134386019865353\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4040940106141016,\n \
\ \"acc_stderr\": 0.013516752972721717\n }\n}\n```"
repo_url: https://huggingface.co/jeff31415/TinyLlama-1.1B-1.5T-OpenOrca-Alpha
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-49-09.888984.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-49-09.888984.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- '**/details_harness|winogrande|5_2024-03-10T00-49-09.888984.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-49-09.888984.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_49_09.888984
path:
- results_2024-03-10T00-49-09.888984.parquet
- split: latest
path:
- results_2024-03-10T00-49-09.888984.parquet
---
# Dataset Card for Evaluation run of g-ronimo/phi-2-OpenHermes-2.5-v2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [g-ronimo/phi-2-OpenHermes-2.5-v2](https://huggingface.co/g-ronimo/phi-2-OpenHermes-2.5-v2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2",
"harness_winogrande_5",
split="train")
```
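Since every per-task config listed above follows the same split layout, a small helper can load the "latest" split of any of them. This is a sketch: it assumes the `datasets` library is available when the helper is called.

```python
def load_latest(config_name: str = "results"):
    """Load the most recent run for one evaluation config.

    The "latest" split always points at the newest run; the
    timestamped splits (e.g. "2024_03_10T00_49_09.888984") pin a
    specific run instead.
    """
    # Imported lazily so the helper can be defined and inspected
    # without the `datasets` library installed.
    from datasets import load_dataset

    return load_dataset(
        "open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2",
        config_name,
        split="latest",
    )
```

For instance, `load_latest("harness_hendrycksTest_virology_5")` returns the per-question details for the virology task, while the default `"results"` config holds the aggregated metrics.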
## Latest results
These are the [latest results from run 2024-03-10T00:49:09.888984](https://huggingface.co/datasets/open-llm-leaderboard/details_g-ronimo__phi-2-OpenHermes-2.5-v2/blob/main/results_2024-03-10T00-49-09.888984.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.564734458241999,
"acc_stderr": 0.03391431521091429,
"acc_norm": 0.5676857564160381,
"acc_norm_stderr": 0.03461774832252384,
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.44887128521126124,
"mc2_stderr": 0.015342799330160783
},
"harness|arc:challenge|25": {
"acc": 0.5699658703071673,
"acc_stderr": 0.01446763155913799,
"acc_norm": 0.5844709897610921,
"acc_norm_stderr": 0.014401366641216388
},
"harness|hellaswag|10": {
"acc": 0.5592511451902011,
"acc_stderr": 0.004954622308738996,
"acc_norm": 0.7456681935869349,
"acc_norm_stderr": 0.004345949382382379
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.48148148148148145,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.48148148148148145,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.0403356566784832,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.0403356566784832
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5735849056603773,
"acc_stderr": 0.03043779434298305,
"acc_norm": 0.5735849056603773,
"acc_norm_stderr": 0.03043779434298305
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6527777777777778,
"acc_stderr": 0.039812405437178615,
"acc_norm": 0.6527777777777778,
"acc_norm_stderr": 0.039812405437178615
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.03794012674697031,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.03794012674697031
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006717,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006717
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.39473684210526316,
"acc_stderr": 0.045981880578165414,
"acc_norm": 0.39473684210526316,
"acc_norm_stderr": 0.045981880578165414
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5103448275862069,
"acc_stderr": 0.04165774775728763,
"acc_norm": 0.5103448275862069,
"acc_norm_stderr": 0.04165774775728763
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.025279850397404907,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.025279850397404907
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6645161290322581,
"acc_stderr": 0.026860206444724352,
"acc_norm": 0.6645161290322581,
"acc_norm_stderr": 0.026860206444724352
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6606060606060606,
"acc_stderr": 0.036974422050315967,
"acc_norm": 0.6606060606060606,
"acc_norm_stderr": 0.036974422050315967
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7373737373737373,
"acc_stderr": 0.03135305009533086,
"acc_norm": 0.7373737373737373,
"acc_norm_stderr": 0.03135305009533086
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7564766839378239,
"acc_stderr": 0.030975436386845436,
"acc_norm": 0.7564766839378239,
"acc_norm_stderr": 0.030975436386845436
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5615384615384615,
"acc_stderr": 0.025158266016868578,
"acc_norm": 0.5615384615384615,
"acc_norm_stderr": 0.025158266016868578
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2962962962962963,
"acc_stderr": 0.027840811495871916,
"acc_norm": 0.2962962962962963,
"acc_norm_stderr": 0.027840811495871916
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.03196876989195778,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.03196876989195778
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7834862385321101,
"acc_stderr": 0.017658710594443128,
"acc_norm": 0.7834862385321101,
"acc_norm_stderr": 0.017658710594443128
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4537037037037037,
"acc_stderr": 0.033953227263757976,
"acc_norm": 0.4537037037037037,
"acc_norm_stderr": 0.033953227263757976
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6617647058823529,
"acc_stderr": 0.03320574612945431,
"acc_norm": 0.6617647058823529,
"acc_norm_stderr": 0.03320574612945431
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7341772151898734,
"acc_stderr": 0.028756799629658342,
"acc_norm": 0.7341772151898734,
"acc_norm_stderr": 0.028756799629658342
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6502242152466368,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.6502242152466368,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6335877862595419,
"acc_stderr": 0.04225875451969638,
"acc_norm": 0.6335877862595419,
"acc_norm_stderr": 0.04225875451969638
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.6990291262135923,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.6990291262135923,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8205128205128205,
"acc_stderr": 0.025140935950335428,
"acc_norm": 0.8205128205128205,
"acc_norm_stderr": 0.025140935950335428
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6756066411238825,
"acc_stderr": 0.016740929047162696,
"acc_norm": 0.6756066411238825,
"acc_norm_stderr": 0.016740929047162696
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.661849710982659,
"acc_stderr": 0.02546977014940017,
"acc_norm": 0.661849710982659,
"acc_norm_stderr": 0.02546977014940017
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2212290502793296,
"acc_stderr": 0.01388216459888727,
"acc_norm": 0.2212290502793296,
"acc_norm_stderr": 0.01388216459888727
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6013071895424836,
"acc_stderr": 0.02803609227389177,
"acc_norm": 0.6013071895424836,
"acc_norm_stderr": 0.02803609227389177
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6237942122186495,
"acc_stderr": 0.02751392568354943,
"acc_norm": 0.6237942122186495,
"acc_norm_stderr": 0.02751392568354943
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6203703703703703,
"acc_stderr": 0.027002521034516468,
"acc_norm": 0.6203703703703703,
"acc_norm_stderr": 0.027002521034516468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4078014184397163,
"acc_stderr": 0.02931601177634356,
"acc_norm": 0.4078014184397163,
"acc_norm_stderr": 0.02931601177634356
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.40808344198174706,
"acc_stderr": 0.012552598958563662,
"acc_norm": 0.40808344198174706,
"acc_norm_stderr": 0.012552598958563662
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.44485294117647056,
"acc_stderr": 0.03018753206032939,
"acc_norm": 0.44485294117647056,
"acc_norm_stderr": 0.03018753206032939
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5718954248366013,
"acc_stderr": 0.0200176292142131,
"acc_norm": 0.5718954248366013,
"acc_norm_stderr": 0.0200176292142131
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6938775510204082,
"acc_stderr": 0.02950489645459595,
"acc_norm": 0.6938775510204082,
"acc_norm_stderr": 0.02950489645459595
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357303,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357303
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5,
"acc_stderr": 0.03892494720807614,
"acc_norm": 0.5,
"acc_norm_stderr": 0.03892494720807614
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.035650796707083106,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.035650796707083106
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3011015911872705,
"mc1_stderr": 0.01605899902610061,
"mc2": 0.44887128521126124,
"mc2_stderr": 0.015342799330160783
},
"harness|winogrande|5": {
"acc": 0.7521704814522494,
"acc_stderr": 0.012134386019865353
},
"harness|gsm8k|5": {
"acc": 0.4040940106141016,
"acc_stderr": 0.013516752972721717
}
}
```
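Once loaded, the same figures are available programmatically. As a minimal sketch, assuming the JSON above has been parsed into a dict named `metrics` (only a few keys are reproduced here):

```python
# A small excerpt of the results JSON shown above.
metrics = {
    "all": {"acc": 0.564734458241999, "acc_norm": 0.5676857564160381},
    "harness|winogrande|5": {"acc": 0.7521704814522494},
    "harness|gsm8k|5": {"acc": 0.4040940106141016},
}

# Headline numbers are simple nested-dict lookups keyed by task name.
overall_acc = metrics["all"]["acc"]
winogrande = metrics["harness|winogrande|5"]["acc"]

print(f"overall acc: {overall_acc:.4f}, winogrande: {winogrande:.4f}")
```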
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

---
pretty_name: Evaluation run of Inv/Konstanta-V3-AlphaFlavour-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Inv/Konstanta-V3-AlphaFlavour-7B](https://huggingface.co/Inv/Konstanta-V3-AlphaFlavour-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:51:57.811629](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B/blob/main/results_2024-03-10T00-51-57.811629.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6165673948352764,\n\
\ \"acc_stderr\": 0.03301622733382914,\n \"acc_norm\": 0.6173135100008581,\n\
\ \"acc_norm_stderr\": 0.03369417604002207,\n \"mc1\": 0.5740514075887393,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7194257395133424,\n\
\ \"mc2_stderr\": 0.014722025416322865\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6604095563139932,\n \"acc_stderr\": 0.013839039762820166,\n\
\ \"acc_norm\": 0.6885665529010239,\n \"acc_norm_stderr\": 0.01353247209985094\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6752638916550487,\n\
\ \"acc_stderr\": 0.004673191423861212,\n \"acc_norm\": 0.8684524995020912,\n\
\ \"acc_norm_stderr\": 0.0033730738635822915\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6716981132075471,\n \"acc_stderr\": 0.02890159361241178,\n\
\ \"acc_norm\": 0.6716981132075471,\n \"acc_norm_stderr\": 0.02890159361241178\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4019607843137255,\n \"acc_stderr\": 0.048786087144669955,\n\
\ \"acc_norm\": 0.4019607843137255,\n \"acc_norm_stderr\": 0.048786087144669955\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.43859649122807015,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.43859649122807015,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.02522545028406788,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.02522545028406788\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5838709677419355,\n\
\ \"acc_stderr\": 0.02804098138076154,\n \"acc_norm\": 0.5838709677419355,\n\
\ \"acc_norm_stderr\": 0.02804098138076154\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586815,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586815\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.02478431694215639,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.02478431694215639\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.027309140588230193,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.027309140588230193\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6386554621848739,\n \"acc_stderr\": 0.03120469122515002,\n \
\ \"acc_norm\": 0.6386554621848739,\n \"acc_norm_stderr\": 0.03120469122515002\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33112582781456956,\n \"acc_stderr\": 0.038425817186598696,\n \"\
acc_norm\": 0.33112582781456956,\n \"acc_norm_stderr\": 0.038425817186598696\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399306,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399306\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4675925925925926,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.4675925925925926,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.030587591351604246,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.030587591351604246\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7763713080168776,\n \"acc_stderr\": 0.027123298205229966,\n \
\ \"acc_norm\": 0.7763713080168776,\n \"acc_norm_stderr\": 0.027123298205229966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7099236641221374,\n \"acc_stderr\": 0.03980066246467766,\n\
\ \"acc_norm\": 0.7099236641221374,\n \"acc_norm_stderr\": 0.03980066246467766\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615624,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615624\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841403,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841403\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757431,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757431\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.02410571260775431,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.02410571260775431\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3776536312849162,\n\
\ \"acc_stderr\": 0.01621414875213663,\n \"acc_norm\": 0.3776536312849162,\n\
\ \"acc_norm_stderr\": 0.01621414875213663\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6895424836601307,\n \"acc_stderr\": 0.026493033225145898,\n\
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.026493033225145898\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6720257234726688,\n\
\ \"acc_stderr\": 0.026664410886937617,\n \"acc_norm\": 0.6720257234726688,\n\
\ \"acc_norm_stderr\": 0.026664410886937617\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6697530864197531,\n \"acc_stderr\": 0.026168298456732846,\n\
\ \"acc_norm\": 0.6697530864197531,\n \"acc_norm_stderr\": 0.026168298456732846\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.02979071924382972,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.02979071924382972\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\
\ \"acc_stderr\": 0.012683972513598818,\n \"acc_norm\": 0.44198174706649285,\n\
\ \"acc_norm_stderr\": 0.012683972513598818\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.029408372932278746,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.029408372932278746\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6405228758169934,\n \"acc_stderr\": 0.01941253924203216,\n \
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.01941253924203216\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6069651741293532,\n\
\ \"acc_stderr\": 0.0345368246603156,\n \"acc_norm\": 0.6069651741293532,\n\
\ \"acc_norm_stderr\": 0.0345368246603156\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.02796678585916089,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.02796678585916089\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5740514075887393,\n\
\ \"mc1_stderr\": 0.01731047190407654,\n \"mc2\": 0.7194257395133424,\n\
\ \"mc2_stderr\": 0.014722025416322865\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8153117600631413,\n \"acc_stderr\": 0.01090597811215688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5951478392721758,\n \
\ \"acc_stderr\": 0.01352081766687051\n }\n}\n```"
repo_url: https://huggingface.co/Inv/Konstanta-V3-AlphaFlavour-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-51-57.811629.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-51-57.811629.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- '**/details_harness|winogrande|5_2024-03-10T00-51-57.811629.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-51-57.811629.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_51_57.811629
path:
- results_2024-03-10T00-51-57.811629.parquet
- split: latest
path:
- results_2024-03-10T00-51-57.811629.parquet
---
# Dataset Card for Evaluation run of Inv/Konstanta-V3-AlphaFlavour-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Inv/Konstanta-V3-AlphaFlavour-7B](https://huggingface.co/Inv/Konstanta-V3-AlphaFlavour-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T00:51:57.811629](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-V3-AlphaFlavour-7B/blob/main/results_2024-03-10T00-51-57.811629.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the "results" and "latest" splits for each eval):
```python
{
"all": {
"acc": 0.6165673948352764,
"acc_stderr": 0.03301622733382914,
"acc_norm": 0.6173135100008581,
"acc_norm_stderr": 0.03369417604002207,
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7194257395133424,
"mc2_stderr": 0.014722025416322865
},
"harness|arc:challenge|25": {
"acc": 0.6604095563139932,
"acc_stderr": 0.013839039762820166,
"acc_norm": 0.6885665529010239,
"acc_norm_stderr": 0.01353247209985094
},
"harness|hellaswag|10": {
"acc": 0.6752638916550487,
"acc_stderr": 0.004673191423861212,
"acc_norm": 0.8684524995020912,
"acc_norm_stderr": 0.0033730738635822915
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6716981132075471,
"acc_stderr": 0.02890159361241178,
"acc_norm": 0.6716981132075471,
"acc_norm_stderr": 0.02890159361241178
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4019607843137255,
"acc_stderr": 0.048786087144669955,
"acc_norm": 0.4019607843137255,
"acc_norm_stderr": 0.048786087144669955
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.43859649122807015,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.43859649122807015,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.02522545028406788,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.02522545028406788
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5838709677419355,
"acc_stderr": 0.02804098138076154,
"acc_norm": 0.5838709677419355,
"acc_norm_stderr": 0.02804098138076154
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586815,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586815
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.02478431694215639,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.02478431694215639
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.027309140588230193,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.027309140588230193
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6386554621848739,
"acc_stderr": 0.03120469122515002,
"acc_norm": 0.6386554621848739,
"acc_norm_stderr": 0.03120469122515002
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33112582781456956,
"acc_stderr": 0.038425817186598696,
"acc_norm": 0.33112582781456956,
"acc_norm_stderr": 0.038425817186598696
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399306,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399306
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4675925925925926,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.4675925925925926,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.030587591351604246,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.030587591351604246
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7763713080168776,
"acc_stderr": 0.027123298205229966,
"acc_norm": 0.7763713080168776,
"acc_norm_stderr": 0.027123298205229966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7099236641221374,
"acc_stderr": 0.03980066246467766,
"acc_norm": 0.7099236641221374,
"acc_norm_stderr": 0.03980066246467766
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615624,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615624
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841403,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841403
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757431,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757431
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.02410571260775431,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.02410571260775431
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3776536312849162,
"acc_stderr": 0.01621414875213663,
"acc_norm": 0.3776536312849162,
"acc_norm_stderr": 0.01621414875213663
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.026493033225145898,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.026493033225145898
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6720257234726688,
"acc_stderr": 0.026664410886937617,
"acc_norm": 0.6720257234726688,
"acc_norm_stderr": 0.026664410886937617
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6697530864197531,
"acc_stderr": 0.026168298456732846,
"acc_norm": 0.6697530864197531,
"acc_norm_stderr": 0.026168298456732846
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.02979071924382972,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.02979071924382972
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.012683972513598818,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.012683972513598818
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.625,
"acc_stderr": 0.029408372932278746,
"acc_norm": 0.625,
"acc_norm_stderr": 0.029408372932278746
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.01941253924203216,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.01941253924203216
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6069651741293532,
"acc_stderr": 0.0345368246603156,
"acc_norm": 0.6069651741293532,
"acc_norm_stderr": 0.0345368246603156
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.02796678585916089,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.02796678585916089
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5740514075887393,
"mc1_stderr": 0.01731047190407654,
"mc2": 0.7194257395133424,
"mc2_stderr": 0.014722025416322865
},
"harness|winogrande|5": {
"acc": 0.8153117600631413,
"acc_stderr": 0.01090597811215688
},
"harness|gsm8k|5": {
"acc": 0.5951478392721758,
"acc_stderr": 0.01352081766687051
}
}
```
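Once loaded, the per-task entries in this JSON follow the `harness|<task>|<n_shots>` naming convention, so you can filter and aggregate them with plain dictionary operations. A minimal sketch on an inline sample mirroring the structure above (the values are copied from the results shown; any aggregation beyond a simple mean is up to you):

```python
import json

# Small inline sample mirroring the structure of the results JSON above.
results = json.loads("""
{
  "harness|hendrycksTest-virology|5": {"acc": 0.5240963855421686},
  "harness|hendrycksTest-world_religions|5": {"acc": 0.8421052631578947},
  "harness|winogrande|5": {"acc": 0.8153117600631413}
}
""")

# Average accuracy over the hendrycksTest (MMLU) subtasks only,
# selected by their "harness|hendrycksTest-" key prefix.
mmlu = [v["acc"] for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")]
mmlu_avg = sum(mmlu) / len(mmlu)
print(f"MMLU subtask average over {len(mmlu)} tasks: {mmlu_avg:.4f}")
```

The same prefix-based selection works on the full results file, since every MMLU subtask shares the `harness|hendrycksTest-` prefix while ARC, HellaSwag, TruthfulQA, Winogrande, and GSM8K use their own task names.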
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_Inv__Konstanta-V3-BetaFlavour-7B | open-llm-leaderboard-old | "2024-03-10T00:54:39Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T00:54:20Z" | ---
pretty_name: Evaluation run of Inv/Konstanta-V3-BetaFlavour-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Inv/Konstanta-V3-BetaFlavour-7B](https://huggingface.co/Inv/Konstanta-V3-BetaFlavour-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:52:07.659928](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B/blob/main/results_2024-03-10T00-52-07.659928.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6181847903415443,\n\
\ \"acc_stderr\": 0.03300581002480976,\n \"acc_norm\": 0.6193449608180348,\n\
\ \"acc_norm_stderr\": 0.03368163016889217,\n \"mc1\": 0.5862913096695227,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.729182943288448,\n\
\ \"mc2_stderr\": 0.01463892425987301\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.658703071672355,\n \"acc_stderr\": 0.01385583128749772,\n\
\ \"acc_norm\": 0.681740614334471,\n \"acc_norm_stderr\": 0.013611993916971453\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6781517625970922,\n\
\ \"acc_stderr\": 0.004662303395239621,\n \"acc_norm\": 0.8687512447719578,\n\
\ \"acc_norm_stderr\": 0.003369821004762251\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.55,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.53,\n \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\"\
: 0.53,\n \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.037038511930995215,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.037038511930995215\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663454,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663454\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5234042553191489,\n \"acc_stderr\": 0.03265019475033582,\n\
\ \"acc_norm\": 0.5234042553191489,\n \"acc_norm_stderr\": 0.03265019475033582\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4298245614035088,\n\
\ \"acc_stderr\": 0.04657047260594963,\n \"acc_norm\": 0.4298245614035088,\n\
\ \"acc_norm_stderr\": 0.04657047260594963\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.02535574126305527,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.02535574126305527\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4444444444444444,\n\
\ \"acc_stderr\": 0.044444444444444495,\n \"acc_norm\": 0.4444444444444444,\n\
\ \"acc_norm_stderr\": 0.044444444444444495\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.5741935483870968,\n\
\ \"acc_stderr\": 0.028129112709165904,\n \"acc_norm\": 0.5741935483870968,\n\
\ \"acc_norm_stderr\": 0.028129112709165904\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.49261083743842365,\n \"acc_stderr\": 0.03517603540361008,\n\
\ \"acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.03517603540361008\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.029376616484945627,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.029376616484945627\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8497409326424871,\n \"acc_stderr\": 0.025787723180723886,\n\
\ \"acc_norm\": 0.8497409326424871,\n \"acc_norm_stderr\": 0.025787723180723886\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6128205128205129,\n \"acc_stderr\": 0.02469721693087894,\n \
\ \"acc_norm\": 0.6128205128205129,\n \"acc_norm_stderr\": 0.02469721693087894\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2851851851851852,\n \"acc_stderr\": 0.027528599210340492,\n \
\ \"acc_norm\": 0.2851851851851852,\n \"acc_norm_stderr\": 0.027528599210340492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8256880733944955,\n \"acc_stderr\": 0.016265675632010347,\n \"\
acc_norm\": 0.8256880733944955,\n \"acc_norm_stderr\": 0.016265675632010347\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7549019607843137,\n \"acc_stderr\": 0.030190282453501947,\n \"\
acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.030190282453501947\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7848101265822784,\n \"acc_stderr\": 0.02675082699467618,\n \
\ \"acc_norm\": 0.7848101265822784,\n \"acc_norm_stderr\": 0.02675082699467618\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7251908396946565,\n \"acc_stderr\": 0.03915345408847836,\n\
\ \"acc_norm\": 0.7251908396946565,\n \"acc_norm_stderr\": 0.03915345408847836\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8007662835249042,\n\
\ \"acc_stderr\": 0.014283378044296418,\n \"acc_norm\": 0.8007662835249042,\n\
\ \"acc_norm_stderr\": 0.014283378044296418\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.41787709497206704,\n\
\ \"acc_stderr\": 0.016495400635820084,\n \"acc_norm\": 0.41787709497206704,\n\
\ \"acc_norm_stderr\": 0.016495400635820084\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7058823529411765,\n \"acc_stderr\": 0.026090162504279053,\n\
\ \"acc_norm\": 0.7058823529411765,\n \"acc_norm_stderr\": 0.026090162504279053\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6882716049382716,\n \"acc_stderr\": 0.02577311116963045,\n\
\ \"acc_norm\": 0.6882716049382716,\n \"acc_norm_stderr\": 0.02577311116963045\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873866,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873866\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.44198174706649285,\n\
\ \"acc_stderr\": 0.012683972513598816,\n \"acc_norm\": 0.44198174706649285,\n\
\ \"acc_norm_stderr\": 0.012683972513598816\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.019291961895066382,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.019291961895066382\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.582089552238806,\n\
\ \"acc_stderr\": 0.034875586404620636,\n \"acc_norm\": 0.582089552238806,\n\
\ \"acc_norm_stderr\": 0.034875586404620636\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \
\ \"acc_norm\": 0.8,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5862913096695227,\n\
\ \"mc1_stderr\": 0.0172408618120998,\n \"mc2\": 0.729182943288448,\n\
\ \"mc2_stderr\": 0.01463892425987301\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8129439621152328,\n \"acc_stderr\": 0.010959716435242912\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5686125852918877,\n \
\ \"acc_stderr\": 0.013642195352511563\n }\n}\n```"
repo_url: https://huggingface.co/Inv/Konstanta-V3-BetaFlavour-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-52-07.659928.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-52-07.659928.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- '**/details_harness|winogrande|5_2024-03-10T00-52-07.659928.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-52-07.659928.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_52_07.659928
path:
- results_2024-03-10T00-52-07.659928.parquet
- split: latest
path:
- results_2024-03-10T00-52-07.659928.parquet
---
# Dataset Card for Evaluation run of Inv/Konstanta-V3-BetaFlavour-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Inv/Konstanta-V3-BetaFlavour-7B](https://huggingface.co/Inv/Konstanta-V3-BetaFlavour-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B",
"harness_winogrande_5",
split="train")
```
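As a side note, each timestamped split name maps mechanically onto the parquet filenames listed in the configs above: the underscores in the split name's timestamp become dashes in the filename. A minimal sketch (the split name and file pattern below are taken directly from this card's configs):

```python
# The split name as it appears under `data_files` in the configs above.
split_name = "2024_03_10T00_52_07.659928"

# Parquet filenames use dashes instead of underscores in the timestamp part.
file_timestamp = split_name.replace("_", "-")

# Reconstruct, e.g., the winogrande parquet filename from that timestamp.
parquet_name = f"details_harness|winogrande|5_{file_timestamp}.parquet"
print(parquet_name)  # details_harness|winogrande|5_2024-03-10T00-52-07.659928.parquet
```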
## Latest results
These are the [latest results from run 2024-03-10T00:52:07.659928](https://huggingface.co/datasets/open-llm-leaderboard/details_Inv__Konstanta-V3-BetaFlavour-7B/blob/main/results_2024-03-10T00-52-07.659928.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6181847903415443,
"acc_stderr": 0.03300581002480976,
"acc_norm": 0.6193449608180348,
"acc_norm_stderr": 0.03368163016889217,
"mc1": 0.5862913096695227,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.729182943288448,
"mc2_stderr": 0.01463892425987301
},
"harness|arc:challenge|25": {
"acc": 0.658703071672355,
"acc_stderr": 0.01385583128749772,
"acc_norm": 0.681740614334471,
"acc_norm_stderr": 0.013611993916971453
},
"harness|hellaswag|10": {
"acc": 0.6781517625970922,
"acc_stderr": 0.004662303395239621,
"acc_norm": 0.8687512447719578,
"acc_norm_stderr": 0.003369821004762251
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.037038511930995215,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.037038511930995215
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663454,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663454
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.74,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5234042553191489,
"acc_stderr": 0.03265019475033582,
"acc_norm": 0.5234042553191489,
"acc_norm_stderr": 0.03265019475033582
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4298245614035088,
"acc_stderr": 0.04657047260594963,
"acc_norm": 0.4298245614035088,
"acc_norm_stderr": 0.04657047260594963
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.02535574126305527,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.02535574126305527
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4444444444444444,
"acc_stderr": 0.044444444444444495,
"acc_norm": 0.4444444444444444,
"acc_norm_stderr": 0.044444444444444495
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.5741935483870968,
"acc_stderr": 0.028129112709165904,
"acc_norm": 0.5741935483870968,
"acc_norm_stderr": 0.028129112709165904
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.03517603540361008,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.03517603540361008
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.029376616484945627,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.029376616484945627
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8497409326424871,
"acc_stderr": 0.025787723180723886,
"acc_norm": 0.8497409326424871,
"acc_norm_stderr": 0.025787723180723886
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6128205128205129,
"acc_stderr": 0.02469721693087894,
"acc_norm": 0.6128205128205129,
"acc_norm_stderr": 0.02469721693087894
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2851851851851852,
"acc_stderr": 0.027528599210340492,
"acc_norm": 0.2851851851851852,
"acc_norm_stderr": 0.027528599210340492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8256880733944955,
"acc_stderr": 0.016265675632010347,
"acc_norm": 0.8256880733944955,
"acc_norm_stderr": 0.016265675632010347
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.030190282453501947,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.030190282453501947
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7848101265822784,
"acc_stderr": 0.02675082699467618,
"acc_norm": 0.7848101265822784,
"acc_norm_stderr": 0.02675082699467618
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7251908396946565,
"acc_stderr": 0.03915345408847836,
"acc_norm": 0.7251908396946565,
"acc_norm_stderr": 0.03915345408847836
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8007662835249042,
"acc_stderr": 0.014283378044296418,
"acc_norm": 0.8007662835249042,
"acc_norm_stderr": 0.014283378044296418
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.41787709497206704,
"acc_stderr": 0.016495400635820084,
"acc_norm": 0.41787709497206704,
"acc_norm_stderr": 0.016495400635820084
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7058823529411765,
"acc_stderr": 0.026090162504279053,
"acc_norm": 0.7058823529411765,
"acc_norm_stderr": 0.026090162504279053
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6882716049382716,
"acc_stderr": 0.02577311116963045,
"acc_norm": 0.6882716049382716,
"acc_norm_stderr": 0.02577311116963045
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873866,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873866
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.44198174706649285,
"acc_stderr": 0.012683972513598816,
"acc_norm": 0.44198174706649285,
"acc_norm_stderr": 0.012683972513598816
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.019291961895066382,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.019291961895066382
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.582089552238806,
"acc_stderr": 0.034875586404620636,
"acc_norm": 0.582089552238806,
"acc_norm_stderr": 0.034875586404620636
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5862913096695227,
"mc1_stderr": 0.0172408618120998,
"mc2": 0.729182943288448,
"mc2_stderr": 0.01463892425987301
},
"harness|winogrande|5": {
"acc": 0.8129439621152328,
"acc_stderr": 0.010959716435242912
},
"harness|gsm8k|5": {
"acc": 0.5686125852918877,
"acc_stderr": 0.013642195352511563
}
}
```
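Once the results JSON above is downloaded, the aggregate block and the per-task entries can be separated with a few lines of standard-library code. The sketch below inlines only a small subset of the values shown above for illustration:

```python
import json

# Subset of the results JSON shown above, inlined as a string for illustration.
results_json = """
{
  "all": {"acc": 0.6181847903415443, "acc_norm": 0.6193449608180348},
  "harness|winogrande|5": {"acc": 0.8129439621152328},
  "harness|gsm8k|5": {"acc": 0.5686125852918877}
}
"""
results = json.loads(results_json)

overall = results["all"]  # aggregate metrics across all tasks
per_task = {name: metrics for name, metrics in results.items()
            if name.startswith("harness|")}  # one entry per eval task

print(overall["acc"])    # 0.6181847903415443
print(sorted(per_task))  # ['harness|gsm8k|5', 'harness|winogrande|5']
```

The same pattern works on the full results file, which simply has one `harness|...` entry per evaluated task alongside the `all` block.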
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
maomlab/HematoxLong2023 | maomlab | "2024-10-09T18:50:17Z" | 0 | 0 | [
"task_categories:tabular-classification",
"language:en",
"license:mit",
"size_categories:1K<n<10K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"chemistry",
"chemical information"
] | [
"tabular-classification"
] | "2024-03-10T00:58:16Z" | ---
license: mit
language: en
tags:
- chemistry
- chemical information
task_categories:
- tabular-classification
pretty_name: Hematotoxicity Dataset
dataset_summary: >-
The hematotoxicity dataset consists of a training set with 1788 molecules and
a test set with 594 molecules. The train and test datasets were created after
sanitizing and splitting the original dataset in the paper below.
citation: |-
@article{,
author = {Teng-Zhi Long, Shao-Hua Shi, Shao Liu, Ai-Ping Lu, Zhao-Qian Liu, Min Li, Ting-Jun Hou*, and Dong-Sheng Cao},
doi = {10.1021/acs.jcim.2c01088},
journal = {Journal of Chemical Information and Modeling},
number = {1},
title = {Structural Analysis and Prediction of Hematotoxicity Using Deep Learning Approaches},
volume = {63},
year = {2023},
url = {https://pubs.acs.org/doi/10.1021/acs.jcim.2c01088},
publisher = {ACS publications}
}
size_categories:
- 1K<n<10K
config_names:
- HematoxLong2023
configs:
- config_name: HematoxLong2023
data_files:
- split: test
path: HematoxLong2023/test.csv
- split: train
path: HematoxLong2023/train.csv
dataset_info:
- config_name: HematoxLong2023
features:
- name: "SMILES"
dtype: string
- name: "Label"
dtype:
class_label:
names:
0: "negative"
1: "positive"
splits:
- name: train
num_bytes: 28736
num_examples: 1788
- name: test
num_bytes: 9632
num_examples: 594
---
# Hematotoxicity Dataset (HematoxLong2023)
A hematotoxicity dataset containing 1772 chemicals was obtained, which includes a positive set with 589 molecules and a negative set with 1183 molecules.
The molecules were divided into a training set of 1330 molecules and a test set of 442 molecules according to their Murcko scaffolds.
Additionally, 610 new molecules from related research and databases were compiled as the external validation set.
The train and test datasets uploaded to our Hugging Face repository have been sanitized and split from the original dataset, which contains 2382 molecules.
If you would like to try these processes with the original dataset, please follow the instructions in the [Preprocessing Script.py](https://huggingface.co/datasets/maomlab/HematoxLong2023/blob/main/Preprocessing%20Script.py) file located in the HematoxLong2023 repository.
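Scaffold-based splitting, as used above, keeps every Murcko scaffold entirely on one side of the train/test boundary. A minimal, self-contained sketch of the idea follows; the (molecule, scaffold) pairs are hypothetical stand-ins for what RDKit's `MurckoScaffold` utilities would compute from each SMILES string:

```python
from collections import defaultdict

# Hypothetical (molecule, scaffold) pairs; in practice each scaffold would be
# derived from the SMILES string with RDKit's MurckoScaffold utilities.
records = [
    ("mol_a", "scaffold_1"), ("mol_b", "scaffold_1"),
    ("mol_c", "scaffold_2"), ("mol_d", "scaffold_3"),
]

# Group molecules by scaffold so no scaffold is split across the two sets.
by_scaffold = defaultdict(list)
for mol, scaffold in records:
    by_scaffold[scaffold].append(mol)

# Fill the training set with whole scaffold groups until the target size
# (75% here) is reached; the remaining groups form the test set.
target = int(0.75 * len(records))
train, test = [], []
for mols in by_scaffold.values():
    (train if len(train) < target else test).extend(mols)
```

Because assignment happens per scaffold group rather than per molecule, the resulting test set probes generalization to unseen chemical scaffolds rather than near-duplicates of training compounds.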
## Quickstart Usage
### Load a dataset in python
Each subset can be loaded into Python using the Hugging Face [datasets](https://huggingface.co/docs/datasets/index) library.
First, from the command line install the `datasets` library
```shell
$ pip install datasets
```
then, from within Python, load the datasets library
```python
>>> import datasets
```
and load one of the `HematoxLong2023` datasets, e.g.,
```python
>>> HematoxLong2023 = datasets.load_dataset("maomlab/HematoxLong2023", name = "HematoxLong2023")
Downloading readme: 100%|██████████| 5.23k/5.23k [00:00<00:00, 35.1kB/s]
Downloading data: 100%|██████████| 34.5k/34.5k [00:00<00:00, 155kB/s]
Downloading data: 100%|██████████| 97.1k/97.1k [00:00<00:00, 587kB/s]
Generating test split: 100%|██████████| 594/594 [00:00<00:00, 12705.92 examples/s]
Generating train split: 100%|██████████| 1788/1788 [00:00<00:00, 43895.91 examples/s]
```
and inspecting the loaded dataset
```python
>>> HematoxLong2023
DatasetDict({
    test: Dataset({
        features: ['SMILES', 'Label'],
        num_rows: 594
    })
    train: Dataset({
        features: ['SMILES', 'Label'],
        num_rows: 1788
    })
})
```
### Use a dataset to train a model
One way to use the dataset is through the [MolFlux](https://exscientia.github.io/molflux/) package developed by Exscientia.
First, from the command line, install the `MolFlux` library with `catboost` and `rdkit` support
```shell
pip install 'molflux[catboost,rdkit]'
```
then load, featurize, split, fit, and evaluate the catboost model
```python
import json

from datasets import load_dataset

from molflux.datasets import featurise_dataset
from molflux.features import load_from_dicts as load_representations_from_dicts
from molflux.splits import load_from_dict as load_split_from_dict
from molflux.modelzoo import load_from_dict as load_model_from_dict
from molflux.metrics import load_suite

# Load the pre-split dataset and featurise the SMILES column
split_dataset = load_dataset('maomlab/HematoxLong2023', name = 'HematoxLong2023')

split_featurised_dataset = featurise_dataset(
  split_dataset,
  column = "SMILES",
  representations = load_representations_from_dicts([{"name": "morgan"}, {"name": "maccs_rdkit"}]))

# Train and evaluate the catboost model
model = load_model_from_dict({
    "name": "cat_boost_classifier",
    "config": {
        "x_features": ['SMILES::morgan', 'SMILES::maccs_rdkit'],
        "y_features": ['Label']}})

model.train(split_featurised_dataset["train"])
preds = model.predict(split_featurised_dataset["test"])

classification_suite = load_suite("classification")

scores = classification_suite.compute(
    references=split_featurised_dataset["test"]['Label'],
    predictions=preds["cat_boost_classifier::Label"])
```
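The classification suite returns a dictionary of metrics keyed by name. As a rough, self-contained illustration of what its accuracy entry measures, computed here by hand on toy labels (hypothetical values, not drawn from the dataset):

```python
# Toy ground-truth labels and model outputs (hypothetical values).
references = [1, 0, 1, 1, 0]
predictions = [1, 0, 0, 1, 0]

# Accuracy is the fraction of positions where the two sequences agree.
accuracy = sum(r == p for r, p in zip(references, predictions)) / len(references)
print(accuracy)  # 0.8
```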
## Citation
Cite this:
J. Chem. Inf. Model. 2023, 63, 1, 111–125
Publication Date: December 6, 2022
https://doi.org/10.1021/acs.jcim.2c01088
Copyright © 2024 American Chemical Society |
open-llm-leaderboard-old/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1 | open-llm-leaderboard-old | "2024-03-10T01:00:49Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T00:59:28Z" | ---
pretty_name: Evaluation run of azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1](https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:57:08.636734](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1/blob/main/results_2024-03-10T00-57-08.636734.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks; you can find each one in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.607447095635537,\n\
\ \"acc_stderr\": 0.03314052014839398,\n \"acc_norm\": 0.6119347527420224,\n\
\ \"acc_norm_stderr\": 0.033811338894945774,\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6822484423368418,\n\
\ \"mc2_stderr\": 0.015197767693951841\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5887372013651877,\n \"acc_stderr\": 0.014379441068522085,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491888\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6681935869348735,\n\
\ \"acc_stderr\": 0.004698995789478832,\n \"acc_norm\": 0.8484365664210317,\n\
\ \"acc_norm_stderr\": 0.003578643387547847\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.625,\n \"acc_stderr\": 0.039397364351956274,\n \
\ \"acc_norm\": 0.625,\n \"acc_norm_stderr\": 0.039397364351956274\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6944444444444444,\n\
\ \"acc_stderr\": 0.03852084696008534,\n \"acc_norm\": 0.6944444444444444,\n\
\ \"acc_norm_stderr\": 0.03852084696008534\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.53,\n\
\ \"acc_stderr\": 0.050161355804659205,\n \"acc_norm\": 0.53,\n \
\ \"acc_norm_stderr\": 0.050161355804659205\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5838150289017341,\n\
\ \"acc_stderr\": 0.03758517775404948,\n \"acc_norm\": 0.5838150289017341,\n\
\ \"acc_norm_stderr\": 0.03758517775404948\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4215686274509804,\n \"acc_stderr\": 0.04913595201274498,\n\
\ \"acc_norm\": 0.4215686274509804,\n \"acc_norm_stderr\": 0.04913595201274498\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5319148936170213,\n \"acc_stderr\": 0.03261936918467382,\n\
\ \"acc_norm\": 0.5319148936170213,\n \"acc_norm_stderr\": 0.03261936918467382\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.40350877192982454,\n\
\ \"acc_stderr\": 0.04615186962583703,\n \"acc_norm\": 0.40350877192982454,\n\
\ \"acc_norm_stderr\": 0.04615186962583703\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.373015873015873,\n \"acc_stderr\": 0.02490699045899257,\n \"acc_norm\"\
: 0.373015873015873,\n \"acc_norm_stderr\": 0.02490699045899257\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04426266681379909,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04426266681379909\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.635483870967742,\n\
\ \"acc_stderr\": 0.027379871229943245,\n \"acc_norm\": 0.635483870967742,\n\
\ \"acc_norm_stderr\": 0.027379871229943245\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.65,\n \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\"\
: 0.65,\n \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7393939393939394,\n \"acc_stderr\": 0.034277431758165236,\n\
\ \"acc_norm\": 0.7393939393939394,\n \"acc_norm_stderr\": 0.034277431758165236\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198896,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198896\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306443,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5564102564102564,\n \"acc_stderr\": 0.025189149894764205,\n\
\ \"acc_norm\": 0.5564102564102564,\n \"acc_norm_stderr\": 0.025189149894764205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3074074074074074,\n \"acc_stderr\": 0.02813325257881563,\n \
\ \"acc_norm\": 0.3074074074074074,\n \"acc_norm_stderr\": 0.02813325257881563\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343233,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343233\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145624,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145624\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.042365112580946336,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.042365112580946336\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.047184714852195886,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.047184714852195886\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597552,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597552\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252609,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.04725815626252609\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7803320561941252,\n\
\ \"acc_stderr\": 0.014805384478371155,\n \"acc_norm\": 0.7803320561941252,\n\
\ \"acc_norm_stderr\": 0.014805384478371155\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6965317919075145,\n \"acc_stderr\": 0.024752411960917205,\n\
\ \"acc_norm\": 0.6965317919075145,\n \"acc_norm_stderr\": 0.024752411960917205\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.31620111731843575,\n\
\ \"acc_stderr\": 0.015551673652172547,\n \"acc_norm\": 0.31620111731843575,\n\
\ \"acc_norm_stderr\": 0.015551673652172547\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6797385620915033,\n \"acc_stderr\": 0.02671611838015685,\n\
\ \"acc_norm\": 0.6797385620915033,\n \"acc_norm_stderr\": 0.02671611838015685\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7006172839506173,\n \"acc_stderr\": 0.02548311560119546,\n\
\ \"acc_norm\": 0.7006172839506173,\n \"acc_norm_stderr\": 0.02548311560119546\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.43285528031290743,\n\
\ \"acc_stderr\": 0.012654565234622868,\n \"acc_norm\": 0.43285528031290743,\n\
\ \"acc_norm_stderr\": 0.012654565234622868\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6139705882352942,\n \"acc_stderr\": 0.029573269134411124,\n\
\ \"acc_norm\": 0.6139705882352942,\n \"acc_norm_stderr\": 0.029573269134411124\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6339869281045751,\n \"acc_stderr\": 0.019488025745529675,\n \
\ \"acc_norm\": 0.6339869281045751,\n \"acc_norm_stderr\": 0.019488025745529675\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7090909090909091,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.7090909090909091,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.02916273841024977,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.02916273841024977\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.736318407960199,\n\
\ \"acc_stderr\": 0.03115715086935557,\n \"acc_norm\": 0.736318407960199,\n\
\ \"acc_norm_stderr\": 0.03115715086935557\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333047,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333047\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5287637698898409,\n\
\ \"mc1_stderr\": 0.017474513848525518,\n \"mc2\": 0.6822484423368418,\n\
\ \"mc2_stderr\": 0.015197767693951841\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.771112865035517,\n \"acc_stderr\": 0.01180736022402539\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.40106141015921154,\n \
\ \"acc_stderr\": 0.013500158922245542\n }\n}\n```"
repo_url: https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-57-08.636734.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- '**/details_harness|winogrande|5_2024-03-10T00-57-08.636734.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-57-08.636734.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_57_08.636734
path:
- results_2024-03-10T00-57-08.636734.parquet
- split: latest
path:
- results_2024-03-10T00-57-08.636734.parquet
---
# Dataset Card for Evaluation run of azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1](https://huggingface.co/azarafrooz/Mistral-7B-Instruct-v2-sp-v0.1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1",
"harness_winogrande_5",
split="train")
```
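As noted above, split names are derived from the run timestamp. A minimal sketch of that mapping, assuming the convention visible in the config listing holds for all runs (the helper name is ours, not part of the dataset tooling):

```python
def timestamp_to_split_name(timestamp: str) -> str:
    """Map a run timestamp to its split name by replacing ':' and '-' with '_'."""
    return timestamp.replace(":", "_").replace("-", "_")

# e.g. the run shown in this card:
print(timestamp_to_split_name("2024-03-10T00:57:08.636734"))
# → 2024_03_10T00_57_08.636734
```

The resulting string can be passed as the `split` argument to `load_dataset` to pin a specific run instead of `"latest"`.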
## Latest results
These are the [latest results from run 2024-03-10T00:57:08.636734](https://huggingface.co/datasets/open-llm-leaderboard/details_azarafrooz__Mistral-7B-Instruct-v2-sp-v0.1/blob/main/results_2024-03-10T00-57-08.636734.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.607447095635537,
"acc_stderr": 0.03314052014839398,
"acc_norm": 0.6119347527420224,
"acc_norm_stderr": 0.033811338894945774,
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6822484423368418,
"mc2_stderr": 0.015197767693951841
},
"harness|arc:challenge|25": {
"acc": 0.5887372013651877,
"acc_stderr": 0.014379441068522085,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491888
},
"harness|hellaswag|10": {
"acc": 0.6681935869348735,
"acc_stderr": 0.004698995789478832,
"acc_norm": 0.8484365664210317,
"acc_norm_stderr": 0.003578643387547847
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.625,
"acc_stderr": 0.039397364351956274,
"acc_norm": 0.625,
"acc_norm_stderr": 0.039397364351956274
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6944444444444444,
"acc_stderr": 0.03852084696008534,
"acc_norm": 0.6944444444444444,
"acc_norm_stderr": 0.03852084696008534
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.53,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5838150289017341,
"acc_stderr": 0.03758517775404948,
"acc_norm": 0.5838150289017341,
"acc_norm_stderr": 0.03758517775404948
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4215686274509804,
"acc_stderr": 0.04913595201274498,
"acc_norm": 0.4215686274509804,
"acc_norm_stderr": 0.04913595201274498
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5319148936170213,
"acc_stderr": 0.03261936918467382,
"acc_norm": 0.5319148936170213,
"acc_norm_stderr": 0.03261936918467382
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.40350877192982454,
"acc_stderr": 0.04615186962583703,
"acc_norm": 0.40350877192982454,
"acc_norm_stderr": 0.04615186962583703
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.02490699045899257,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.02490699045899257
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04426266681379909,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04426266681379909
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.635483870967742,
"acc_stderr": 0.027379871229943245,
"acc_norm": 0.635483870967742,
"acc_norm_stderr": 0.027379871229943245
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7393939393939394,
"acc_stderr": 0.034277431758165236,
"acc_norm": 0.7393939393939394,
"acc_norm_stderr": 0.034277431758165236
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198896,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198896
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306443,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5564102564102564,
"acc_stderr": 0.025189149894764205,
"acc_norm": 0.5564102564102564,
"acc_norm_stderr": 0.025189149894764205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3074074074074074,
"acc_stderr": 0.02813325257881563,
"acc_norm": 0.3074074074074074,
"acc_norm_stderr": 0.02813325257881563
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343233,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343233
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145624,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145624
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.042365112580946336,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.042365112580946336
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.047184714852195886,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.047184714852195886
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597552,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597552
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252609,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252609
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7803320561941252,
"acc_stderr": 0.014805384478371155,
"acc_norm": 0.7803320561941252,
"acc_norm_stderr": 0.014805384478371155
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6965317919075145,
"acc_stderr": 0.024752411960917205,
"acc_norm": 0.6965317919075145,
"acc_norm_stderr": 0.024752411960917205
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.31620111731843575,
"acc_stderr": 0.015551673652172547,
"acc_norm": 0.31620111731843575,
"acc_norm_stderr": 0.015551673652172547
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6797385620915033,
"acc_stderr": 0.02671611838015685,
"acc_norm": 0.6797385620915033,
"acc_norm_stderr": 0.02671611838015685
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7006172839506173,
"acc_stderr": 0.02548311560119546,
"acc_norm": 0.7006172839506173,
"acc_norm_stderr": 0.02548311560119546
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.43285528031290743,
"acc_stderr": 0.012654565234622868,
"acc_norm": 0.43285528031290743,
"acc_norm_stderr": 0.012654565234622868
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6139705882352942,
"acc_stderr": 0.029573269134411124,
"acc_norm": 0.6139705882352942,
"acc_norm_stderr": 0.029573269134411124
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6339869281045751,
"acc_stderr": 0.019488025745529675,
"acc_norm": 0.6339869281045751,
"acc_norm_stderr": 0.019488025745529675
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7090909090909091,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.7090909090909091,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.02916273841024977,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.02916273841024977
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.736318407960199,
"acc_stderr": 0.03115715086935557,
"acc_norm": 0.736318407960199,
"acc_norm_stderr": 0.03115715086935557
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333047,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333047
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5287637698898409,
"mc1_stderr": 0.017474513848525518,
"mc2": 0.6822484423368418,
"mc2_stderr": 0.015197767693951841
},
"harness|winogrande|5": {
"acc": 0.771112865035517,
"acc_stderr": 0.01180736022402539
},
"harness|gsm8k|5": {
"acc": 0.40106141015921154,
"acc_stderr": 0.013500158922245542
}
}
```
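The per-task parquet files referenced in the configs above follow a regular naming pattern: in file names the date keeps its dashes while the time colons become dashes. A hedged sketch of that pattern (the helper name is ours; this simply reconstructs the globs listed in the YAML configs):

```python
def details_glob(task: str, n_shot: int, timestamp: str) -> str:
    """Build the glob pattern for a task's details parquet from its run timestamp."""
    file_ts = timestamp.replace(":", "-")  # colons are not valid in file names
    return f"**/details_harness|{task}|{n_shot}_{file_ts}.parquet"

print(details_glob("hendrycksTest-virology", 5, "2024-03-10T00:57:08.636734"))
# → **/details_harness|hendrycksTest-virology|5_2024-03-10T00-57-08.636734.parquet
```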
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction | open-llm-leaderboard-old | "2024-03-10T01:02:07Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T01:01:47Z" | ---
pretty_name: Evaluation run of Severian/Nexus-IKM-Mistral-7B-v5-instruction
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Severian/Nexus-IKM-Mistral-7B-v5-instruction](https://huggingface.co/Severian/Nexus-IKM-Mistral-7B-v5-instruction)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T00:59:27.972031](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction/blob/main/results_2024-03-10T00-59-27.972031.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2477168147645942,\n\
\ \"acc_stderr\": 0.030566707099033714,\n \"acc_norm\": 0.24811298552173527,\n\
\ \"acc_norm_stderr\": 0.031378435870979805,\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871096,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|arc:challenge|25\": {\n \"acc\"\
: 0.2363481228668942,\n \"acc_stderr\": 0.012414960524301836,\n \"\
acc_norm\": 0.2773037542662116,\n \"acc_norm_stderr\": 0.013082095839059374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.2671778530173272,\n\
\ \"acc_stderr\": 0.004415816696303075,\n \"acc_norm\": 0.2892850029874527,\n\
\ \"acc_norm_stderr\": 0.004525037849178834\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.22962962962962963,\n\
\ \"acc_stderr\": 0.036333844140734636,\n \"acc_norm\": 0.22962962962962963,\n\
\ \"acc_norm_stderr\": 0.036333844140734636\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.24342105263157895,\n \"acc_stderr\": 0.034923496688842384,\n\
\ \"acc_norm\": 0.24342105263157895,\n \"acc_norm_stderr\": 0.034923496688842384\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.21,\n\
\ \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.21,\n \
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.27169811320754716,\n \"acc_stderr\": 0.027377706624670716,\n\
\ \"acc_norm\": 0.27169811320754716,\n \"acc_norm_stderr\": 0.027377706624670716\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2152777777777778,\n\
\ \"acc_stderr\": 0.03437079344106133,\n \"acc_norm\": 0.2152777777777778,\n\
\ \"acc_norm_stderr\": 0.03437079344106133\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.21,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.21,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2658959537572254,\n\
\ \"acc_stderr\": 0.03368762932259431,\n \"acc_norm\": 0.2658959537572254,\n\
\ \"acc_norm_stderr\": 0.03368762932259431\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \"acc_norm\": 0.26,\n\
\ \"acc_norm_stderr\": 0.0440844002276808\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.0281854413012341,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.0281854413012341\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.23809523809523808,\n \"acc_stderr\": 0.021935878081184756,\n \"\
acc_norm\": 0.23809523809523808,\n \"acc_norm_stderr\": 0.021935878081184756\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.30952380952380953,\n\
\ \"acc_stderr\": 0.04134913018303316,\n \"acc_norm\": 0.30952380952380953,\n\
\ \"acc_norm_stderr\": 0.04134913018303316\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2838709677419355,\n\
\ \"acc_stderr\": 0.025649381063029254,\n \"acc_norm\": 0.2838709677419355,\n\
\ \"acc_norm_stderr\": 0.025649381063029254\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2512315270935961,\n \"acc_stderr\": 0.030516530732694436,\n\
\ \"acc_norm\": 0.2512315270935961,\n \"acc_norm_stderr\": 0.030516530732694436\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932269,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.04163331998932269\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24848484848484848,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.24848484848484848,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25757575757575757,\n \"acc_stderr\": 0.031156269519646836,\n \"\
acc_norm\": 0.25757575757575757,\n \"acc_norm_stderr\": 0.031156269519646836\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21243523316062177,\n \"acc_stderr\": 0.029519282616817258,\n\
\ \"acc_norm\": 0.21243523316062177,\n \"acc_norm_stderr\": 0.029519282616817258\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.02361088430892786,\n\
\ \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2740740740740741,\n \"acc_stderr\": 0.027195934804085622,\n \
\ \"acc_norm\": 0.2740740740740741,\n \"acc_norm_stderr\": 0.027195934804085622\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2781456953642384,\n \"acc_stderr\": 0.03658603262763744,\n \"\
acc_norm\": 0.2781456953642384,\n \"acc_norm_stderr\": 0.03658603262763744\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26788990825688075,\n \"acc_stderr\": 0.018987462257978652,\n \"\
acc_norm\": 0.26788990825688075,\n \"acc_norm_stderr\": 0.018987462257978652\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2175925925925926,\n \"acc_stderr\": 0.028139689444859672,\n \"\
acc_norm\": 0.2175925925925926,\n \"acc_norm_stderr\": 0.028139689444859672\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.24019607843137256,\n \"acc_stderr\": 0.02998373305591362,\n \"\
acc_norm\": 0.24019607843137256,\n \"acc_norm_stderr\": 0.02998373305591362\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.20675105485232068,\n \"acc_stderr\": 0.0263616516683891,\n \
\ \"acc_norm\": 0.20675105485232068,\n \"acc_norm_stderr\": 0.0263616516683891\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.13901345291479822,\n\
\ \"acc_stderr\": 0.02321935283447447,\n \"acc_norm\": 0.13901345291479822,\n\
\ \"acc_norm_stderr\": 0.02321935283447447\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.18181818181818182,\n \"acc_stderr\": 0.03520893951097653,\n \"\
acc_norm\": 0.18181818181818182,\n \"acc_norm_stderr\": 0.03520893951097653\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25,\n\
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.25,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.27607361963190186,\n \"acc_stderr\": 0.0351238528370505,\n\
\ \"acc_norm\": 0.27607361963190186,\n \"acc_norm_stderr\": 0.0351238528370505\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.24107142857142858,\n\
\ \"acc_stderr\": 0.04059867246952685,\n \"acc_norm\": 0.24107142857142858,\n\
\ \"acc_norm_stderr\": 0.04059867246952685\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.30097087378640774,\n \"acc_stderr\": 0.04541609446503948,\n\
\ \"acc_norm\": 0.30097087378640774,\n \"acc_norm_stderr\": 0.04541609446503948\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.18376068376068377,\n\
\ \"acc_stderr\": 0.025372139671722933,\n \"acc_norm\": 0.18376068376068377,\n\
\ \"acc_norm_stderr\": 0.025372139671722933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.22349936143039592,\n\
\ \"acc_stderr\": 0.01489723522945071,\n \"acc_norm\": 0.22349936143039592,\n\
\ \"acc_norm_stderr\": 0.01489723522945071\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2514450867052023,\n \"acc_stderr\": 0.023357365785874037,\n\
\ \"acc_norm\": 0.2514450867052023,\n \"acc_norm_stderr\": 0.023357365785874037\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369923,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369923\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2549019607843137,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2282958199356913,\n\
\ \"acc_stderr\": 0.023839303311398215,\n \"acc_norm\": 0.2282958199356913,\n\
\ \"acc_norm_stderr\": 0.023839303311398215\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25308641975308643,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.25308641975308643,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2375886524822695,\n \"acc_stderr\": 0.025389512552729906,\n \
\ \"acc_norm\": 0.2375886524822695,\n \"acc_norm_stderr\": 0.025389512552729906\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.01092649610203496,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.01092649610203496\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.027678468642144703,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.027678468642144703\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.22712418300653595,\n \"acc_stderr\": 0.016949853279212376,\n \
\ \"acc_norm\": 0.22712418300653595,\n \"acc_norm_stderr\": 0.016949853279212376\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.24545454545454545,\n\
\ \"acc_stderr\": 0.041220665028782855,\n \"acc_norm\": 0.24545454545454545,\n\
\ \"acc_norm_stderr\": 0.041220665028782855\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2857142857142857,\n \"acc_stderr\": 0.02892058322067558,\n\
\ \"acc_norm\": 0.2857142857142857,\n \"acc_norm_stderr\": 0.02892058322067558\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.14427860696517414,\n\
\ \"acc_stderr\": 0.024845753212306042,\n \"acc_norm\": 0.14427860696517414,\n\
\ \"acc_norm_stderr\": 0.024845753212306042\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384739,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384739\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.2469879518072289,\n\
\ \"acc_stderr\": 0.03357351982064537,\n \"acc_norm\": 0.2469879518072289,\n\
\ \"acc_norm_stderr\": 0.03357351982064537\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.1695906432748538,\n \"acc_stderr\": 0.028782108105401712,\n\
\ \"acc_norm\": 0.1695906432748538,\n \"acc_norm_stderr\": 0.028782108105401712\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23623011015911874,\n\
\ \"mc1_stderr\": 0.014869755015871096,\n \"mc2\": NaN,\n \"\
mc2_stderr\": NaN\n },\n \"harness|winogrande|5\": {\n \"acc\": 0.5351223362273086,\n\
\ \"acc_stderr\": 0.014017773120881583\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/Severian/Nexus-IKM-Mistral-7B-v5-instruction
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-59-27.972031.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T00-59-27.972031.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- '**/details_harness|winogrande|5_2024-03-10T00-59-27.972031.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T00-59-27.972031.parquet'
- config_name: results
data_files:
- split: 2024_03_10T00_59_27.972031
path:
- results_2024-03-10T00-59-27.972031.parquet
- split: latest
path:
- results_2024-03-10T00-59-27.972031.parquet
---
# Dataset Card for Evaluation run of Severian/Nexus-IKM-Mistral-7B-v5-instruction
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Severian/Nexus-IKM-Mistral-7B-v5-instruction](https://huggingface.co/Severian/Nexus-IKM-Mistral-7B-v5-instruction) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction",
"harness_winogrande_5",
split="train")
```
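The per-run split names listed in the `configs` section are derived from the run timestamp, with the `-` and `:` separators replaced by underscores. A minimal sketch (the helper name is ours, not part of the `datasets` API) of building such a split name:

```python
# Hypothetical helper: map a run timestamp like "2024-03-10T00:59:27.972031"
# to the split name used by this dataset ("2024_03_10T00_59_27.972031").
def split_name_from_timestamp(ts: str) -> str:
    date, time = ts.split("T")
    return date.replace("-", "_") + "T" + time.replace(":", "_")

print(split_name_from_timestamp("2024-03-10T00:59:27.972031"))
# 2024_03_10T00_59_27.972031
```

Passing that name as the `split` argument instead of `"latest"` pins the load to one specific run.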
## Latest results
These are the [latest results from run 2024-03-10T00:59:27.972031](https://huggingface.co/datasets/open-llm-leaderboard/details_Severian__Nexus-IKM-Mistral-7B-v5-instruction/blob/main/results_2024-03-10T00-59-27.972031.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2477168147645942,
"acc_stderr": 0.030566707099033714,
"acc_norm": 0.24811298552173527,
"acc_norm_stderr": 0.031378435870979805,
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871096,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|arc:challenge|25": {
"acc": 0.2363481228668942,
"acc_stderr": 0.012414960524301836,
"acc_norm": 0.2773037542662116,
"acc_norm_stderr": 0.013082095839059374
},
"harness|hellaswag|10": {
"acc": 0.2671778530173272,
"acc_stderr": 0.004415816696303075,
"acc_norm": 0.2892850029874527,
"acc_norm_stderr": 0.004525037849178834
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.22962962962962963,
"acc_stderr": 0.036333844140734636,
"acc_norm": 0.22962962962962963,
"acc_norm_stderr": 0.036333844140734636
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.24342105263157895,
"acc_stderr": 0.034923496688842384,
"acc_norm": 0.24342105263157895,
"acc_norm_stderr": 0.034923496688842384
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.27169811320754716,
"acc_stderr": 0.027377706624670716,
"acc_norm": 0.27169811320754716,
"acc_norm_stderr": 0.027377706624670716
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2152777777777778,
"acc_stderr": 0.03437079344106133,
"acc_norm": 0.2152777777777778,
"acc_norm_stderr": 0.03437079344106133
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.21,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2658959537572254,
"acc_stderr": 0.03368762932259431,
"acc_norm": 0.2658959537572254,
"acc_norm_stderr": 0.03368762932259431
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.0281854413012341,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.0281854413012341
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.23809523809523808,
"acc_stderr": 0.021935878081184756,
"acc_norm": 0.23809523809523808,
"acc_norm_stderr": 0.021935878081184756
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.30952380952380953,
"acc_stderr": 0.04134913018303316,
"acc_norm": 0.30952380952380953,
"acc_norm_stderr": 0.04134913018303316
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2838709677419355,
"acc_stderr": 0.025649381063029254,
"acc_norm": 0.2838709677419355,
"acc_norm_stderr": 0.025649381063029254
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2512315270935961,
"acc_stderr": 0.030516530732694436,
"acc_norm": 0.2512315270935961,
"acc_norm_stderr": 0.030516530732694436
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932269,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932269
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24848484848484848,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.24848484848484848,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25757575757575757,
"acc_stderr": 0.031156269519646836,
"acc_norm": 0.25757575757575757,
"acc_norm_stderr": 0.031156269519646836
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21243523316062177,
"acc_stderr": 0.029519282616817258,
"acc_norm": 0.21243523316062177,
"acc_norm_stderr": 0.029519282616817258
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31794871794871793,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.31794871794871793,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.027195934804085622,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.027195934804085622
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2781456953642384,
"acc_stderr": 0.03658603262763744,
"acc_norm": 0.2781456953642384,
"acc_norm_stderr": 0.03658603262763744
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26788990825688075,
"acc_stderr": 0.018987462257978652,
"acc_norm": 0.26788990825688075,
"acc_norm_stderr": 0.018987462257978652
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2175925925925926,
"acc_stderr": 0.028139689444859672,
"acc_norm": 0.2175925925925926,
"acc_norm_stderr": 0.028139689444859672
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.24019607843137256,
"acc_stderr": 0.02998373305591362,
"acc_norm": 0.24019607843137256,
"acc_norm_stderr": 0.02998373305591362
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.20675105485232068,
"acc_stderr": 0.0263616516683891,
"acc_norm": 0.20675105485232068,
"acc_norm_stderr": 0.0263616516683891
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.13901345291479822,
"acc_stderr": 0.02321935283447447,
"acc_norm": 0.13901345291479822,
"acc_norm_stderr": 0.02321935283447447
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.18181818181818182,
"acc_stderr": 0.03520893951097653,
"acc_norm": 0.18181818181818182,
"acc_norm_stderr": 0.03520893951097653
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.27607361963190186,
"acc_stderr": 0.0351238528370505,
"acc_norm": 0.27607361963190186,
"acc_norm_stderr": 0.0351238528370505
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.24107142857142858,
"acc_stderr": 0.04059867246952685,
"acc_norm": 0.24107142857142858,
"acc_norm_stderr": 0.04059867246952685
},
"harness|hendrycksTest-management|5": {
"acc": 0.30097087378640774,
"acc_stderr": 0.04541609446503948,
"acc_norm": 0.30097087378640774,
"acc_norm_stderr": 0.04541609446503948
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.18376068376068377,
"acc_stderr": 0.025372139671722933,
"acc_norm": 0.18376068376068377,
"acc_norm_stderr": 0.025372139671722933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.22349936143039592,
"acc_stderr": 0.01489723522945071,
"acc_norm": 0.22349936143039592,
"acc_norm_stderr": 0.01489723522945071
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2514450867052023,
"acc_stderr": 0.023357365785874037,
"acc_norm": 0.2514450867052023,
"acc_norm_stderr": 0.023357365785874037
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369923,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2282958199356913,
"acc_stderr": 0.023839303311398215,
"acc_norm": 0.2282958199356913,
"acc_norm_stderr": 0.023839303311398215
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25308641975308643,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.25308641975308643,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2375886524822695,
"acc_stderr": 0.025389512552729906,
"acc_norm": 0.2375886524822695,
"acc_norm_stderr": 0.025389512552729906
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.01092649610203496,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.01092649610203496
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.027678468642144703,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.027678468642144703
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.22712418300653595,
"acc_stderr": 0.016949853279212376,
"acc_norm": 0.22712418300653595,
"acc_norm_stderr": 0.016949853279212376
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.041220665028782855,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.041220665028782855
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2857142857142857,
"acc_stderr": 0.02892058322067558,
"acc_norm": 0.2857142857142857,
"acc_norm_stderr": 0.02892058322067558
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.14427860696517414,
"acc_stderr": 0.024845753212306042,
"acc_norm": 0.14427860696517414,
"acc_norm_stderr": 0.024845753212306042
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384739,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384739
},
"harness|hendrycksTest-virology|5": {
"acc": 0.2469879518072289,
"acc_stderr": 0.03357351982064537,
"acc_norm": 0.2469879518072289,
"acc_norm_stderr": 0.03357351982064537
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.1695906432748538,
"acc_stderr": 0.028782108105401712,
"acc_norm": 0.1695906432748538,
"acc_norm_stderr": 0.028782108105401712
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23623011015911874,
"mc1_stderr": 0.014869755015871096,
"mc2": NaN,
"mc2_stderr": NaN
},
"harness|winogrande|5": {
"acc": 0.5351223362273086,
"acc_stderr": 0.014017773120881583
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
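Each accuracy above is reported together with its standard error. Under a normal approximation (our assumption; the harness only reports the stderr), an approximate 95% confidence interval is `acc ± 1.96 * stderr`. A quick sketch using the Winogrande numbers from the results block:

```python
# Sketch: turn an accuracy and its standard error into an approximate
# 95% confidence interval, assuming normality of the estimator.
def confidence_interval(acc: float, stderr: float, z: float = 1.96) -> tuple:
    return (acc - z * stderr, acc + z * stderr)

lo, hi = confidence_interval(0.5351223362273086, 0.014017773120881583)
print(f"winogrande 5-shot acc: [{lo:.3f}, {hi:.3f}]")  # roughly [0.508, 0.563]
```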
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_Eric111__Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1 | open-llm-leaderboard-old | "2024-03-10T01:07:37Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T01:07:17Z" | ---
pretty_name: Evaluation run of Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1](https://huggingface.co/Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T01:04:59.662363](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1/blob/main/results_2024-03-10T01-04-59.662363.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each task in the results and in the \"latest\"\
\ split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6223443674693071,\n\
\ \"acc_stderr\": 0.032764128357652716,\n \"acc_norm\": 0.6244939574255076,\n\
\ \"acc_norm_stderr\": 0.033425434862231436,\n \"mc1\": 0.5091799265605875,\n\
\ \"mc1_stderr\": 0.017500550724819753,\n \"mc2\": 0.6564309371860594,\n\
\ \"mc2_stderr\": 0.0155396681594194\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6348122866894198,\n \"acc_stderr\": 0.014070265519268804,\n\
\ \"acc_norm\": 0.6783276450511946,\n \"acc_norm_stderr\": 0.013650488084494162\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6782513443537144,\n\
\ \"acc_stderr\": 0.0046619243147560906,\n \"acc_norm\": 0.8593905596494722,\n\
\ \"acc_norm_stderr\": 0.003469077847056388\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7171052631578947,\n \"acc_stderr\": 0.03665349695640767,\n\
\ \"acc_norm\": 0.7171052631578947,\n \"acc_norm_stderr\": 0.03665349695640767\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.6,\n\
\ \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6754716981132075,\n \"acc_stderr\": 0.02881561571343211,\n\
\ \"acc_norm\": 0.6754716981132075,\n \"acc_norm_stderr\": 0.02881561571343211\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.52,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6242774566473989,\n\
\ \"acc_stderr\": 0.036928207672648664,\n \"acc_norm\": 0.6242774566473989,\n\
\ \"acc_norm_stderr\": 0.036928207672648664\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.045126085985421276\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5148936170212766,\n \"acc_stderr\": 0.03267151848924777,\n\
\ \"acc_norm\": 0.5148936170212766,\n \"acc_norm_stderr\": 0.03267151848924777\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.38596491228070173,\n\
\ \"acc_stderr\": 0.04579639422070434,\n \"acc_norm\": 0.38596491228070173,\n\
\ \"acc_norm_stderr\": 0.04579639422070434\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5793103448275863,\n \"acc_stderr\": 0.0411391498118926,\n\
\ \"acc_norm\": 0.5793103448275863,\n \"acc_norm_stderr\": 0.0411391498118926\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924003,\n \"\
acc_norm\": 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6516129032258065,\n\
\ \"acc_stderr\": 0.02710482632810094,\n \"acc_norm\": 0.6516129032258065,\n\
\ \"acc_norm_stderr\": 0.02710482632810094\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876105,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\"\
: 0.61,\n \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.029857515673386424,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.029857515673386424\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.02463978909770944,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.02463978909770944\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6102564102564103,\n \"acc_stderr\": 0.024726967886647074,\n\
\ \"acc_norm\": 0.6102564102564103,\n \"acc_norm_stderr\": 0.024726967886647074\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3148148148148148,\n \"acc_stderr\": 0.02831753349606648,\n \
\ \"acc_norm\": 0.3148148148148148,\n \"acc_norm_stderr\": 0.02831753349606648\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.031041941304059285,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.031041941304059285\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.038227469376587525,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.038227469376587525\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8165137614678899,\n \"acc_stderr\": 0.016595259710399303,\n \"\
acc_norm\": 0.8165137614678899,\n \"acc_norm_stderr\": 0.016595259710399303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4583333333333333,\n \"acc_stderr\": 0.033981108902946366,\n \"\
acc_norm\": 0.4583333333333333,\n \"acc_norm_stderr\": 0.033981108902946366\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7598039215686274,\n \"acc_stderr\": 0.02998373305591361,\n \"\
acc_norm\": 0.7598039215686274,\n \"acc_norm_stderr\": 0.02998373305591361\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.026558372502661916,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.026558372502661916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8347107438016529,\n \"acc_stderr\": 0.033907806129727755,\n \"\
acc_norm\": 0.8347107438016529,\n \"acc_norm_stderr\": 0.033907806129727755\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7361963190184049,\n \"acc_stderr\": 0.03462419931615623,\n\
\ \"acc_norm\": 0.7361963190184049,\n \"acc_norm_stderr\": 0.03462419931615623\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8045977011494253,\n\
\ \"acc_stderr\": 0.014179171373424384,\n \"acc_norm\": 0.8045977011494253,\n\
\ \"acc_norm_stderr\": 0.014179171373424384\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7167630057803468,\n \"acc_stderr\": 0.024257901705323374,\n\
\ \"acc_norm\": 0.7167630057803468,\n \"acc_norm_stderr\": 0.024257901705323374\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.5039106145251396,\n\
\ \"acc_stderr\": 0.016721990073156657,\n \"acc_norm\": 0.5039106145251396,\n\
\ \"acc_norm_stderr\": 0.016721990073156657\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6862745098039216,\n \"acc_stderr\": 0.02656892101545714,\n\
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.02656892101545714\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.02616058445014045,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.02616058445014045\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.02474862449053737,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.02474862449053737\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46153846153846156,\n\
\ \"acc_stderr\": 0.012732398286190438,\n \"acc_norm\": 0.46153846153846156,\n\
\ \"acc_norm_stderr\": 0.012732398286190438\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6323529411764706,\n \"acc_stderr\": 0.029289413409403192,\n\
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.029289413409403192\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553697,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553697\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.0282638899437846,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.0282638899437846\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.6119402985074627,\n\
\ \"acc_stderr\": 0.034457899643627506,\n \"acc_norm\": 0.6119402985074627,\n\
\ \"acc_norm_stderr\": 0.034457899643627506\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.03891364495835817,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.03891364495835817\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160882,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160882\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5091799265605875,\n\
\ \"mc1_stderr\": 0.017500550724819753,\n \"mc2\": 0.6564309371860594,\n\
\ \"mc2_stderr\": 0.0155396681594194\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8034727703235991,\n \"acc_stderr\": 0.011168120593569563\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5390447308567097,\n \
\ \"acc_stderr\": 0.013730428449116346\n }\n}\n```"
repo_url: https://huggingface.co/Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-04-59.662363.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-04-59.662363.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- '**/details_harness|winogrande|5_2024-03-10T01-04-59.662363.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T01-04-59.662363.parquet'
- config_name: results
data_files:
- split: 2024_03_10T01_04_59.662363
path:
- results_2024-03-10T01-04-59.662363.parquet
- split: latest
path:
- results_2024-03-10T01-04-59.662363.parquet
---
# Dataset Card for Evaluation run of Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1](https://huggingface.co/Eric111/Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1",
"harness_winogrande_5",
split="train")
```
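Each per-task configuration name follows the pattern `harness_<task>_<num_fewshot>`, as listed in the YAML above. The sketch below illustrates that naming scheme; `config_name` is a hypothetical helper written for this example, not part of the `datasets` API:

```python
# Sketch: build configuration names matching this dataset's naming scheme.
# `config_name` is a hypothetical helper for illustration only.
def config_name(task: str, num_fewshot: int) -> str:
    """Return the config name for a given task and few-shot count."""
    return f"harness_{task}_{num_fewshot}"

print(config_name("winogrande", 5))              # harness_winogrande_5
print(config_name("hendrycksTest_virology", 5))  # harness_hendrycksTest_virology_5

# Such a name is passed as the second argument to datasets.load_dataset, e.g.
# load_dataset(repo_id, config_name("gsm8k", 5), split="latest")
# where split can be "latest" or a specific run timestamp such as
# "2024_03_10T01_04_59.662363".
```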
## Latest results
These are the [latest results from run 2024-03-10T01:04:59.662363](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__Mistral-7B-Instruct_v0.2_UNA-TheBeagle-7b-v1/blob/main/results_2024-03-10T01-04-59.662363.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6223443674693071,
"acc_stderr": 0.032764128357652716,
"acc_norm": 0.6244939574255076,
"acc_norm_stderr": 0.033425434862231436,
"mc1": 0.5091799265605875,
"mc1_stderr": 0.017500550724819753,
"mc2": 0.6564309371860594,
"mc2_stderr": 0.0155396681594194
},
"harness|arc:challenge|25": {
"acc": 0.6348122866894198,
"acc_stderr": 0.014070265519268804,
"acc_norm": 0.6783276450511946,
"acc_norm_stderr": 0.013650488084494162
},
"harness|hellaswag|10": {
"acc": 0.6782513443537144,
"acc_stderr": 0.0046619243147560906,
"acc_norm": 0.8593905596494722,
"acc_norm_stderr": 0.003469077847056388
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7171052631578947,
"acc_stderr": 0.03665349695640767,
"acc_norm": 0.7171052631578947,
"acc_norm_stderr": 0.03665349695640767
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.6,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6754716981132075,
"acc_stderr": 0.02881561571343211,
"acc_norm": 0.6754716981132075,
"acc_norm_stderr": 0.02881561571343211
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6242774566473989,
"acc_stderr": 0.036928207672648664,
"acc_norm": 0.6242774566473989,
"acc_norm_stderr": 0.036928207672648664
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5148936170212766,
"acc_stderr": 0.03267151848924777,
"acc_norm": 0.5148936170212766,
"acc_norm_stderr": 0.03267151848924777
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.38596491228070173,
"acc_stderr": 0.04579639422070434,
"acc_norm": 0.38596491228070173,
"acc_norm_stderr": 0.04579639422070434
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5793103448275863,
"acc_stderr": 0.0411391498118926,
"acc_norm": 0.5793103448275863,
"acc_norm_stderr": 0.0411391498118926
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924003,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6516129032258065,
"acc_stderr": 0.02710482632810094,
"acc_norm": 0.6516129032258065,
"acc_norm_stderr": 0.02710482632810094
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876105,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.029857515673386424,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.029857515673386424
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.02463978909770944,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.02463978909770944
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6102564102564103,
"acc_stderr": 0.024726967886647074,
"acc_norm": 0.6102564102564103,
"acc_norm_stderr": 0.024726967886647074
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.02831753349606648,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.02831753349606648
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.031041941304059285,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.031041941304059285
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.038227469376587525,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.038227469376587525
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8165137614678899,
"acc_stderr": 0.016595259710399303,
"acc_norm": 0.8165137614678899,
"acc_norm_stderr": 0.016595259710399303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4583333333333333,
"acc_stderr": 0.033981108902946366,
"acc_norm": 0.4583333333333333,
"acc_norm_stderr": 0.033981108902946366
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7598039215686274,
"acc_stderr": 0.02998373305591361,
"acc_norm": 0.7598039215686274,
"acc_norm_stderr": 0.02998373305591361
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.026558372502661916,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.026558372502661916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8347107438016529,
"acc_stderr": 0.033907806129727755,
"acc_norm": 0.8347107438016529,
"acc_norm_stderr": 0.033907806129727755
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7361963190184049,
"acc_stderr": 0.03462419931615623,
"acc_norm": 0.7361963190184049,
"acc_norm_stderr": 0.03462419931615623
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8045977011494253,
"acc_stderr": 0.014179171373424384,
"acc_norm": 0.8045977011494253,
"acc_norm_stderr": 0.014179171373424384
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7167630057803468,
"acc_stderr": 0.024257901705323374,
"acc_norm": 0.7167630057803468,
"acc_norm_stderr": 0.024257901705323374
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.5039106145251396,
"acc_stderr": 0.016721990073156657,
"acc_norm": 0.5039106145251396,
"acc_norm_stderr": 0.016721990073156657
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.02656892101545714,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.02656892101545714
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.02616058445014045,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.02616058445014045
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.02474862449053737,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.02474862449053737
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46153846153846156,
"acc_stderr": 0.012732398286190438,
"acc_norm": 0.46153846153846156,
"acc_norm_stderr": 0.012732398286190438
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.029289413409403192,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.029289413409403192
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553697,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553697
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.0282638899437846,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.0282638899437846
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.6119402985074627,
"acc_stderr": 0.034457899643627506,
"acc_norm": 0.6119402985074627,
"acc_norm_stderr": 0.034457899643627506
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.03891364495835817,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.03891364495835817
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160882,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160882
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5091799265605875,
"mc1_stderr": 0.017500550724819753,
"mc2": 0.6564309371860594,
"mc2_stderr": 0.0155396681594194
},
"harness|winogrande|5": {
"acc": 0.8034727703235991,
"acc_stderr": 0.011168120593569563
},
"harness|gsm8k|5": {
"acc": 0.5390447308567097,
"acc_stderr": 0.013730428449116346
}
}
```
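The per-task entries above can be rolled up into a single MMLU-style number by macro-averaging the `acc` values of the `harness|hendrycksTest-*` keys, which is how the leaderboard summarizes the 57 subtasks. A minimal sketch (the dict below is an illustrative excerpt of the results above, not the full set):

```python
# Illustrative excerpt of the results dict shown above.
results = {
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7171052631578947},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|arc:challenge|25": {"acc": 0.6348122866894198},  # not an MMLU task
}

# Macro-average over the MMLU (hendrycksTest) subtasks only.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest-")]
mmlu_macro_avg = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_macro_avg, 4))  # 0.6586
```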
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_harshitv804__MetaMath-Mistral-2x7B | open-llm-leaderboard-old | "2024-03-10T01:08:24Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T01:08:03Z" | ---
pretty_name: Evaluation run of harshitv804/MetaMath-Mistral-2x7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [harshitv804/MetaMath-Mistral-2x7B](https://huggingface.co/harshitv804/MetaMath-Mistral-2x7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_harshitv804__MetaMath-Mistral-2x7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T01:05:45.632321](https://huggingface.co/datasets/open-llm-leaderboard/details_harshitv804__MetaMath-Mistral-2x7B/blob/main/results_2024-03-10T01-05-45.632321.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6218089799272568,\n\
\ \"acc_stderr\": 0.03263681999096668,\n \"acc_norm\": 0.6219459868041436,\n\
\ \"acc_norm_stderr\": 0.03330427342622862,\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.4479737874141746,\n\
\ \"mc2_stderr\": 0.015466809789155087\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5708191126279863,\n \"acc_stderr\": 0.014464085894870653,\n\
\ \"acc_norm\": 0.60580204778157,\n \"acc_norm_stderr\": 0.014280522667467325\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6441943835889266,\n\
\ \"acc_stderr\": 0.004777782584817784,\n \"acc_norm\": 0.8259310894244174,\n\
\ \"acc_norm_stderr\": 0.003783938150151617\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.35294117647058826,\n \"acc_stderr\": 0.04755129616062946,\n\
\ \"acc_norm\": 0.35294117647058826,\n \"acc_norm_stderr\": 0.04755129616062946\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5787234042553191,\n \"acc_stderr\": 0.03227834510146267,\n\
\ \"acc_norm\": 0.5787234042553191,\n \"acc_norm_stderr\": 0.03227834510146267\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067877,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067877\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.042857142857142816,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.042857142857142816\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7258064516129032,\n\
\ \"acc_stderr\": 0.025378139970885196,\n \"acc_norm\": 0.7258064516129032,\n\
\ \"acc_norm_stderr\": 0.025378139970885196\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7454545454545455,\n \"acc_stderr\": 0.03401506715249039,\n\
\ \"acc_norm\": 0.7454545454545455,\n \"acc_norm_stderr\": 0.03401506715249039\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7575757575757576,\n \"acc_stderr\": 0.030532892233932022,\n \"\
acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.030532892233932022\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8549222797927462,\n \"acc_stderr\": 0.025416343096306433,\n\
\ \"acc_norm\": 0.8549222797927462,\n \"acc_norm_stderr\": 0.025416343096306433\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6051282051282051,\n \"acc_stderr\": 0.0247843169421564,\n \
\ \"acc_norm\": 0.6051282051282051,\n \"acc_norm_stderr\": 0.0247843169421564\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465073,\n \
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465073\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.634453781512605,\n \"acc_stderr\": 0.031282177063684614,\n \
\ \"acc_norm\": 0.634453781512605,\n \"acc_norm_stderr\": 0.031282177063684614\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8073394495412844,\n \"acc_stderr\": 0.016909276884936073,\n \"\
acc_norm\": 0.8073394495412844,\n \"acc_norm_stderr\": 0.016909276884936073\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4398148148148148,\n \"acc_stderr\": 0.03385177976044811,\n \"\
acc_norm\": 0.4398148148148148,\n \"acc_norm_stderr\": 0.03385177976044811\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7941176470588235,\n \"acc_stderr\": 0.028379449451588663,\n \"\
acc_norm\": 0.7941176470588235,\n \"acc_norm_stderr\": 0.028379449451588663\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7552742616033755,\n \"acc_stderr\": 0.027985699387036423,\n \
\ \"acc_norm\": 0.7552742616033755,\n \"acc_norm_stderr\": 0.027985699387036423\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228733,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228733\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8888888888888888,\n\
\ \"acc_stderr\": 0.020588491316092368,\n \"acc_norm\": 0.8888888888888888,\n\
\ \"acc_norm_stderr\": 0.020588491316092368\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7969348659003831,\n\
\ \"acc_stderr\": 0.01438552507661157,\n \"acc_norm\": 0.7969348659003831,\n\
\ \"acc_norm_stderr\": 0.01438552507661157\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.024476994076247326,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.024476994076247326\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35083798882681566,\n\
\ \"acc_stderr\": 0.015961036675230952,\n \"acc_norm\": 0.35083798882681566,\n\
\ \"acc_norm_stderr\": 0.015961036675230952\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.025058503316958147,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.025058503316958147\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.684887459807074,\n\
\ \"acc_stderr\": 0.02638527370346449,\n \"acc_norm\": 0.684887459807074,\n\
\ \"acc_norm_stderr\": 0.02638527370346449\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7037037037037037,\n \"acc_stderr\": 0.025407197798890162,\n\
\ \"acc_norm\": 0.7037037037037037,\n \"acc_norm_stderr\": 0.025407197798890162\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4498044328552803,\n\
\ \"acc_stderr\": 0.012705721498565109,\n \"acc_norm\": 0.4498044328552803,\n\
\ \"acc_norm_stderr\": 0.012705721498565109\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6066176470588235,\n \"acc_stderr\": 0.029674288281311155,\n\
\ \"acc_norm\": 0.6066176470588235,\n \"acc_norm_stderr\": 0.029674288281311155\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.019353360547553697,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.019353360547553697\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6816326530612244,\n \"acc_stderr\": 0.029822533793982066,\n\
\ \"acc_norm\": 0.6816326530612244,\n \"acc_norm_stderr\": 0.029822533793982066\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623326,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623326\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774711,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774711\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8128654970760234,\n \"acc_stderr\": 0.029913127232368043,\n\
\ \"acc_norm\": 0.8128654970760234,\n \"acc_norm_stderr\": 0.029913127232368043\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3023255813953488,\n\
\ \"mc1_stderr\": 0.016077509266133026,\n \"mc2\": 0.4479737874141746,\n\
\ \"mc2_stderr\": 0.015466809789155087\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7600631412786109,\n \"acc_stderr\": 0.01200207862948574\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6921910538286581,\n \
\ \"acc_stderr\": 0.01271440100992365\n }\n}\n```"
repo_url: https://huggingface.co/harshitv804/MetaMath-Mistral-2x7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-05-45.632321.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-05-45.632321.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- '**/details_harness|winogrande|5_2024-03-10T01-05-45.632321.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T01-05-45.632321.parquet'
- config_name: results
data_files:
- split: 2024_03_10T01_05_45.632321
path:
- results_2024-03-10T01-05-45.632321.parquet
- split: latest
path:
- results_2024-03-10T01-05-45.632321.parquet
---
# Dataset Card for Evaluation run of harshitv804/MetaMath-Mistral-2x7B
Dataset automatically created during the evaluation run of model [harshitv804/MetaMath-Mistral-2x7B](https://huggingface.co/harshitv804/MetaMath-Mistral-2x7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_harshitv804__MetaMath-Mistral-2x7B",
"harness_winogrande_5",
split="train")
```
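Once loaded, each row of the aggregated `results` configuration mirrors the JSON shown in the "Latest results" section: a dict keyed by task name, with an `"all"` entry holding the averaged metrics. A minimal sketch of pulling out the overall accuracy (the dict literal below copies two entries from that JSON; in practice you would read the full file or dataset row):

```python
# Sketch: navigating the aggregated-results structure of this card.
# The dict below is a hand-copied excerpt of the results JSON shown later
# in this card; it is NOT the full file.
results = {
    "all": {
        "acc": 0.6218089799272568,
        "acc_norm": 0.6219459868041436,
    },
    "harness|arc:challenge|25": {
        "acc": 0.5708191126279863,
    },
}

# The "all" key aggregates across every harness task in the run.
overall_acc = results["all"]["acc"]
print(round(overall_acc, 4))  # 0.6218
```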
## Latest results
These are the [latest results from run 2024-03-10T01:05:45.632321](https://huggingface.co/datasets/open-llm-leaderboard/details_harshitv804__MetaMath-Mistral-2x7B/blob/main/results_2024-03-10T01-05-45.632321.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6218089799272568,
"acc_stderr": 0.03263681999096668,
"acc_norm": 0.6219459868041436,
"acc_norm_stderr": 0.03330427342622862,
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.4479737874141746,
"mc2_stderr": 0.015466809789155087
},
"harness|arc:challenge|25": {
"acc": 0.5708191126279863,
"acc_stderr": 0.014464085894870653,
"acc_norm": 0.60580204778157,
"acc_norm_stderr": 0.014280522667467325
},
"harness|hellaswag|10": {
"acc": 0.6441943835889266,
"acc_stderr": 0.004777782584817784,
"acc_norm": 0.8259310894244174,
"acc_norm_stderr": 0.003783938150151617
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.35294117647058826,
"acc_stderr": 0.04755129616062946,
"acc_norm": 0.35294117647058826,
"acc_norm_stderr": 0.04755129616062946
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5787234042553191,
"acc_stderr": 0.03227834510146267,
"acc_norm": 0.5787234042553191,
"acc_norm_stderr": 0.03227834510146267
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067877,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067877
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.042857142857142816,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.042857142857142816
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7258064516129032,
"acc_stderr": 0.025378139970885196,
"acc_norm": 0.7258064516129032,
"acc_norm_stderr": 0.025378139970885196
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7454545454545455,
"acc_stderr": 0.03401506715249039,
"acc_norm": 0.7454545454545455,
"acc_norm_stderr": 0.03401506715249039
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.030532892233932022,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.030532892233932022
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8549222797927462,
"acc_stderr": 0.025416343096306433,
"acc_norm": 0.8549222797927462,
"acc_norm_stderr": 0.025416343096306433
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6051282051282051,
"acc_stderr": 0.0247843169421564,
"acc_norm": 0.6051282051282051,
"acc_norm_stderr": 0.0247843169421564
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465073,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465073
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.634453781512605,
"acc_stderr": 0.031282177063684614,
"acc_norm": 0.634453781512605,
"acc_norm_stderr": 0.031282177063684614
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8073394495412844,
"acc_stderr": 0.016909276884936073,
"acc_norm": 0.8073394495412844,
"acc_norm_stderr": 0.016909276884936073
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4398148148148148,
"acc_stderr": 0.03385177976044811,
"acc_norm": 0.4398148148148148,
"acc_norm_stderr": 0.03385177976044811
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7941176470588235,
"acc_stderr": 0.028379449451588663,
"acc_norm": 0.7941176470588235,
"acc_norm_stderr": 0.028379449451588663
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7552742616033755,
"acc_stderr": 0.027985699387036423,
"acc_norm": 0.7552742616033755,
"acc_norm_stderr": 0.027985699387036423
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228733,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228733
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8888888888888888,
"acc_stderr": 0.020588491316092368,
"acc_norm": 0.8888888888888888,
"acc_norm_stderr": 0.020588491316092368
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7969348659003831,
"acc_stderr": 0.01438552507661157,
"acc_norm": 0.7969348659003831,
"acc_norm_stderr": 0.01438552507661157
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.024476994076247326,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.024476994076247326
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35083798882681566,
"acc_stderr": 0.015961036675230952,
"acc_norm": 0.35083798882681566,
"acc_norm_stderr": 0.015961036675230952
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.025058503316958147,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.025058503316958147
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.684887459807074,
"acc_stderr": 0.02638527370346449,
"acc_norm": 0.684887459807074,
"acc_norm_stderr": 0.02638527370346449
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.025407197798890162,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.025407197798890162
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4498044328552803,
"acc_stderr": 0.012705721498565109,
"acc_norm": 0.4498044328552803,
"acc_norm_stderr": 0.012705721498565109
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6066176470588235,
"acc_stderr": 0.029674288281311155,
"acc_norm": 0.6066176470588235,
"acc_norm_stderr": 0.029674288281311155
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.019353360547553697,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.019353360547553697
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6816326530612244,
"acc_stderr": 0.029822533793982066,
"acc_norm": 0.6816326530612244,
"acc_norm_stderr": 0.029822533793982066
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623326,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623326
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774711,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774711
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8128654970760234,
"acc_stderr": 0.029913127232368043,
"acc_norm": 0.8128654970760234,
"acc_norm_stderr": 0.029913127232368043
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3023255813953488,
"mc1_stderr": 0.016077509266133026,
"mc2": 0.4479737874141746,
"mc2_stderr": 0.015466809789155087
},
"harness|winogrande|5": {
"acc": 0.7600631412786109,
"acc_stderr": 0.01200207862948574
},
"harness|gsm8k|5": {
"acc": 0.6921910538286581,
"acc_stderr": 0.01271440100992365
}
}
```
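For quick local sanity checks, the per-task numbers above can be aggregated without re-running the harness. A minimal sketch, assuming (as on the Open LLM Leaderboard) an unweighted mean over the MMLU subsets; the three scores below are copied from the results JSON above:

```python
# A handful of the per-task accuracies reported above
# (harness|hendrycksTest-*|5 entries from the results JSON).
scores = {
    "clinical_knowledge": 0.690566037735849,
    "college_biology": 0.7083333333333334,
    "college_chemistry": 0.46,
}

# The leaderboard aggregates MMLU as an unweighted (macro) mean over subsets.
macro_avg = sum(scores.values()) / len(scores)
print(round(macro_avg, 4))  # → 0.6196
```

The same pattern extends to all 57 MMLU subsets if the full results file is parsed.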
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
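While the full field description is pending, the configuration and split names used throughout this card follow a `harness|task|n_shot` pattern (e.g. `harness|winogrande|5`). A small illustrative parser — `parse_task_name` is a hypothetical helper for this sketch, not part of the dataset tooling:

```python
def parse_task_name(name: str) -> dict:
    """Split a harness identifier like 'harness|hendrycksTest-college_biology|5'
    into its framework, task, and few-shot components."""
    harness, task, shots = name.split("|")
    return {"harness": harness, "task": task, "n_shot": int(shots)}

print(parse_task_name("harness|hendrycksTest-college_biology|5"))
# → {'harness': 'harness', 'task': 'hendrycksTest-college_biology', 'n_shot': 5}
```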
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

open-llm-leaderboard-old/details_openchat__openchat-3.5-0106-gemma | open-llm-leaderboard-old | "2024-03-10T01:14:15Z" | 0 | 0 | ["region:us"] | null | "2024-03-10T01:13:54Z"
---
pretty_name: Evaluation run of openchat/openchat-3.5-0106-gemma
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [openchat/openchat-3.5-0106-gemma](https://huggingface.co/openchat/openchat-3.5-0106-gemma)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_openchat__openchat-3.5-0106-gemma\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T01:11:51.946605](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat-3.5-0106-gemma/blob/main/results_2024-03-10T01-11-51.946605.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6494077725351517,\n\
\ \"acc_stderr\": 0.0322000566954579,\n \"acc_norm\": 0.6496352395612349,\n\
\ \"acc_norm_stderr\": 0.03286748315457976,\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.5493155489550888,\n\
\ \"mc2_stderr\": 0.015872860667035752\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6168941979522184,\n \"acc_stderr\": 0.014206472661672877,\n\
\ \"acc_norm\": 0.6467576791808873,\n \"acc_norm_stderr\": 0.013967822714840055\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6145190201155148,\n\
\ \"acc_stderr\": 0.004857140410776736,\n \"acc_norm\": 0.810794662417845,\n\
\ \"acc_norm_stderr\": 0.003908711791243487\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5703703703703704,\n\
\ \"acc_stderr\": 0.042763494943765995,\n \"acc_norm\": 0.5703703703703704,\n\
\ \"acc_norm_stderr\": 0.042763494943765995\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.0373852067611967,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.0373852067611967\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.76,\n\
\ \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n \
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6943396226415094,\n \"acc_stderr\": 0.028353298073322666,\n\
\ \"acc_norm\": 0.6943396226415094,\n \"acc_norm_stderr\": 0.028353298073322666\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7152777777777778,\n\
\ \"acc_stderr\": 0.037738099906869334,\n \"acc_norm\": 0.7152777777777778,\n\
\ \"acc_norm_stderr\": 0.037738099906869334\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909281,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909281\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6297872340425532,\n \"acc_stderr\": 0.03156564682236784,\n\
\ \"acc_norm\": 0.6297872340425532,\n \"acc_norm_stderr\": 0.03156564682236784\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5087719298245614,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.5087719298245614,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.49206349206349204,\n \"acc_stderr\": 0.025748065871673286,\n \"\
acc_norm\": 0.49206349206349204,\n \"acc_norm_stderr\": 0.025748065871673286\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.0351760354036101,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.0351760354036101\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695483,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695483\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.0274796030105388,\n \"acc_norm\"\
: 0.8181818181818182,\n \"acc_norm_stderr\": 0.0274796030105388\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n \
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6384615384615384,\n \"acc_stderr\": 0.02435958146539699,\n \
\ \"acc_norm\": 0.6384615384615384,\n \"acc_norm_stderr\": 0.02435958146539699\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.02882088466625325,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.02882088466625325\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6176470588235294,\n \"acc_stderr\": 0.031566630992154156,\n\
\ \"acc_norm\": 0.6176470588235294,\n \"acc_norm_stderr\": 0.031566630992154156\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530336,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530336\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5601851851851852,\n \"acc_stderr\": 0.0338517797604481,\n \"acc_norm\"\
: 0.5601851851851852,\n \"acc_norm_stderr\": 0.0338517797604481\n },\n\
\ \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\": 0.8235294117647058,\n\
\ \"acc_stderr\": 0.026756401538078955,\n \"acc_norm\": 0.8235294117647058,\n\
\ \"acc_norm_stderr\": 0.026756401538078955\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.8185654008438819,\n \"acc_stderr\": 0.02508596114457966,\n\
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.02508596114457966\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7757847533632287,\n\
\ \"acc_stderr\": 0.027991534258519524,\n \"acc_norm\": 0.7757847533632287,\n\
\ \"acc_norm_stderr\": 0.027991534258519524\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.0345727283691767,\n \"acc_norm\"\
: 0.8264462809917356,\n \"acc_norm_stderr\": 0.0345727283691767\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615768,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615768\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5446428571428571,\n\
\ \"acc_stderr\": 0.04726835553719097,\n \"acc_norm\": 0.5446428571428571,\n\
\ \"acc_norm_stderr\": 0.04726835553719097\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.03675668832233188,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.03675668832233188\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6705202312138728,\n \"acc_stderr\": 0.025305258131879716,\n\
\ \"acc_norm\": 0.6705202312138728,\n \"acc_norm_stderr\": 0.025305258131879716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.21899441340782122,\n\
\ \"acc_stderr\": 0.013831676687303182,\n \"acc_norm\": 0.21899441340782122,\n\
\ \"acc_norm_stderr\": 0.013831676687303182\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6784565916398714,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.6784565916398714,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904212,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904212\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.5019556714471969,\n\
\ \"acc_stderr\": 0.012770138422208628,\n \"acc_norm\": 0.5019556714471969,\n\
\ \"acc_norm_stderr\": 0.012770138422208628\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.02934980313976587,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.02934980313976587\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.0190709855896875,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.0190709855896875\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8009950248756219,\n\
\ \"acc_stderr\": 0.028231365092758406,\n \"acc_norm\": 0.8009950248756219,\n\
\ \"acc_norm_stderr\": 0.028231365092758406\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.81,\n \"acc_stderr\": 0.03942772444036625,\n \
\ \"acc_norm\": 0.81,\n \"acc_norm_stderr\": 0.03942772444036625\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3990208078335373,\n\
\ \"mc1_stderr\": 0.017142825728496763,\n \"mc2\": 0.5493155489550888,\n\
\ \"mc2_stderr\": 0.015872860667035752\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7829518547750592,\n \"acc_stderr\": 0.011585871710209413\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7285822592873389,\n \
\ \"acc_stderr\": 0.012249002026150594\n }\n}\n```"
repo_url: https://huggingface.co/openchat/openchat-3.5-0106-gemma
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-11-51.946605.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-11-51.946605.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- '**/details_harness|winogrande|5_2024-03-10T01-11-51.946605.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T01-11-51.946605.parquet'
- config_name: results
data_files:
- split: 2024_03_10T01_11_51.946605
path:
- results_2024-03-10T01-11-51.946605.parquet
- split: latest
path:
- results_2024-03-10T01-11-51.946605.parquet
---
# Dataset Card for Evaluation run of openchat/openchat-3.5-0106-gemma
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [openchat/openchat-3.5-0106-gemma](https://huggingface.co/openchat/openchat-3.5-0106-gemma) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_openchat__openchat-3.5-0106-gemma",
"harness_winogrande_5",
split="train")
```
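Each MMLU subtask above is exposed as its own configuration, following the naming pattern `harness_hendrycksTest_<subject>_5` (as listed in this card's `configs` metadata). As a sketch, a small helper (hypothetical, named here for illustration) can build these config names from a subject string so you can iterate over subtasks:

```python
def mmlu_config_name(subject: str, n_shot: int = 5) -> str:
    """Build the config name for an MMLU subtask, e.g. 'anatomy' ->
    'harness_hendrycksTest_anatomy_5' (pattern taken from this card's configs)."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# Example (requires the `datasets` library and network access):
# from datasets import load_dataset
# data = load_dataset(
#     "open-llm-leaderboard/details_openchat__openchat-3.5-0106-gemma",
#     mmlu_config_name("anatomy"),
#     split="latest",
# )
```

The `split="latest"` alias always resolves to the most recent timestamped run, as described above.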
## Latest results
These are the [latest results from run 2024-03-10T01:11:51.946605](https://huggingface.co/datasets/open-llm-leaderboard/details_openchat__openchat-3.5-0106-gemma/blob/main/results_2024-03-10T01-11-51.946605.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6494077725351517,
"acc_stderr": 0.0322000566954579,
"acc_norm": 0.6496352395612349,
"acc_norm_stderr": 0.03286748315457976,
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.5493155489550888,
"mc2_stderr": 0.015872860667035752
},
"harness|arc:challenge|25": {
"acc": 0.6168941979522184,
"acc_stderr": 0.014206472661672877,
"acc_norm": 0.6467576791808873,
"acc_norm_stderr": 0.013967822714840055
},
"harness|hellaswag|10": {
"acc": 0.6145190201155148,
"acc_stderr": 0.004857140410776736,
"acc_norm": 0.810794662417845,
"acc_norm_stderr": 0.003908711791243487
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5703703703703704,
"acc_stderr": 0.042763494943765995,
"acc_norm": 0.5703703703703704,
"acc_norm_stderr": 0.042763494943765995
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.0373852067611967,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.0373852067611967
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6943396226415094,
"acc_stderr": 0.028353298073322666,
"acc_norm": 0.6943396226415094,
"acc_norm_stderr": 0.028353298073322666
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7152777777777778,
"acc_stderr": 0.037738099906869334,
"acc_norm": 0.7152777777777778,
"acc_norm_stderr": 0.037738099906869334
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909281,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909281
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6297872340425532,
"acc_stderr": 0.03156564682236784,
"acc_norm": 0.6297872340425532,
"acc_norm_stderr": 0.03156564682236784
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5087719298245614,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.5087719298245614,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.025748065871673286,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.025748065871673286
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.0351760354036101,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.0351760354036101
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695483,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695483
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.0274796030105388,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.0274796030105388
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6384615384615384,
"acc_stderr": 0.02435958146539699,
"acc_norm": 0.6384615384615384,
"acc_norm_stderr": 0.02435958146539699
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.02882088466625325,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.02882088466625325
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6176470588235294,
"acc_stderr": 0.031566630992154156,
"acc_norm": 0.6176470588235294,
"acc_norm_stderr": 0.031566630992154156
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530336,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530336
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5601851851851852,
"acc_stderr": 0.0338517797604481,
"acc_norm": 0.5601851851851852,
"acc_norm_stderr": 0.0338517797604481
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078955,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078955
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.02508596114457966,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.02508596114457966
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7757847533632287,
"acc_stderr": 0.027991534258519524,
"acc_norm": 0.7757847533632287,
"acc_norm_stderr": 0.027991534258519524
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.0345727283691767,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.0345727283691767
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615768,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615768
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5446428571428571,
"acc_stderr": 0.04726835553719097,
"acc_norm": 0.5446428571428571,
"acc_norm_stderr": 0.04726835553719097
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.03675668832233188,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.03675668832233188
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6705202312138728,
"acc_stderr": 0.025305258131879716,
"acc_norm": 0.6705202312138728,
"acc_norm_stderr": 0.025305258131879716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.21899441340782122,
"acc_stderr": 0.013831676687303182,
"acc_norm": 0.21899441340782122,
"acc_norm_stderr": 0.013831676687303182
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6784565916398714,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.6784565916398714,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904212,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904212
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.5019556714471969,
"acc_stderr": 0.012770138422208628,
"acc_norm": 0.5019556714471969,
"acc_norm_stderr": 0.012770138422208628
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.02934980313976587,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.02934980313976587
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.0190709855896875,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.0190709855896875
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8009950248756219,
"acc_stderr": 0.028231365092758406,
"acc_norm": 0.8009950248756219,
"acc_norm_stderr": 0.028231365092758406
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.81,
"acc_stderr": 0.03942772444036625,
"acc_norm": 0.81,
"acc_norm_stderr": 0.03942772444036625
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3990208078335373,
"mc1_stderr": 0.017142825728496763,
"mc2": 0.5493155489550888,
"mc2_stderr": 0.015872860667035752
},
"harness|winogrande|5": {
"acc": 0.7829518547750592,
"acc_stderr": 0.011585871710209413
},
"harness|gsm8k|5": {
"acc": 0.7285822592873389,
"acc_stderr": 0.012249002026150594
}
}
```
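For readers who want a quick sanity check on the numbers above: the reported `acc_stderr` values are consistent with the usual sample standard error of a proportion. A minimal sketch, assuming (neither is confirmed by this card) that the WinoGrande validation split has n = 1267 examples and that the harness computes sqrt(p * (1 - p) / (n - 1)):

```python
import math

# Reported winogrande accuracy and stderr from the results block above.
p = 0.7829518547750592
reported_stderr = 0.011585871710209413

# Assumptions: n = 1267 (WinoGrande validation split size) and a
# sample standard error of a proportion with an n - 1 denominator.
n = 1267
stderr = math.sqrt(p * (1 - p) / (n - 1))

# The recomputed value agrees with the reported one to well under 1e-5.
assert abs(stderr - reported_stderr) < 1e-5
```

The same check can be applied to any of the per-task accuracies above, given that task's sample count; treat the formula and split size here as assumptions, not as documentation of the harness.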
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_InnerI__I-Code-NousLlama7B-slerp | open-llm-leaderboard-old | "2024-03-10T01:14:26Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T01:14:05Z" | ---
pretty_name: Evaluation run of InnerI/I-Code-NousLlama7B-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [InnerI/I-Code-NousLlama7B-slerp](https://huggingface.co/InnerI/I-Code-NousLlama7B-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_InnerI__I-Code-NousLlama7B-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T01:11:46.189486](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__I-Code-NousLlama7B-slerp/blob/main/results_2024-03-10T01-11-46.189486.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2897605521045545,\n\
\ \"acc_stderr\": 0.03188468910540453,\n \"acc_norm\": 0.29130314890817316,\n\
\ \"acc_norm_stderr\": 0.03269780079088881,\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.3617412829286974,\n\
\ \"mc2_stderr\": 0.014985922251134924\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.38139931740614336,\n \"acc_stderr\": 0.014194389086685261,\n\
\ \"acc_norm\": 0.4035836177474403,\n \"acc_norm_stderr\": 0.014337158914268447\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46723760207130055,\n\
\ \"acc_stderr\": 0.0049790580784787,\n \"acc_norm\": 0.6105357498506274,\n\
\ \"acc_norm_stderr\": 0.004866322258335979\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.23703703703703705,\n\
\ \"acc_stderr\": 0.03673731683969506,\n \"acc_norm\": 0.23703703703703705,\n\
\ \"acc_norm_stderr\": 0.03673731683969506\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.21710526315789475,\n \"acc_stderr\": 0.03355045304882922,\n\
\ \"acc_norm\": 0.21710526315789475,\n \"acc_norm_stderr\": 0.03355045304882922\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.28,\n\
\ \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n \
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.28679245283018867,\n \"acc_stderr\": 0.027834912527544064,\n\
\ \"acc_norm\": 0.28679245283018867,\n \"acc_norm_stderr\": 0.027834912527544064\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.17,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.17,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.36,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.36,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2254335260115607,\n\
\ \"acc_stderr\": 0.03186209851641145,\n \"acc_norm\": 0.2254335260115607,\n\
\ \"acc_norm_stderr\": 0.03186209851641145\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.19607843137254902,\n \"acc_stderr\": 0.03950581861179962,\n\
\ \"acc_norm\": 0.19607843137254902,\n \"acc_norm_stderr\": 0.03950581861179962\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \"acc_norm\": 0.38,\n\
\ \"acc_norm_stderr\": 0.04878317312145632\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.3021276595744681,\n \"acc_stderr\": 0.030017554471880554,\n\
\ \"acc_norm\": 0.3021276595744681,\n \"acc_norm_stderr\": 0.030017554471880554\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2894736842105263,\n\
\ \"acc_stderr\": 0.04266339443159394,\n \"acc_norm\": 0.2894736842105263,\n\
\ \"acc_norm_stderr\": 0.04266339443159394\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2689655172413793,\n \"acc_stderr\": 0.03695183311650232,\n\
\ \"acc_norm\": 0.2689655172413793,\n \"acc_norm_stderr\": 0.03695183311650232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.02141168439369418,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.02141168439369418\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047182,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047182\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.22903225806451613,\n \"acc_stderr\": 0.023904914311782658,\n \"\
acc_norm\": 0.22903225806451613,\n \"acc_norm_stderr\": 0.023904914311782658\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n \"\
acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.32323232323232326,\n \"acc_stderr\": 0.03332299921070645,\n \"\
acc_norm\": 0.32323232323232326,\n \"acc_norm_stderr\": 0.03332299921070645\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.2538860103626943,\n \"acc_stderr\": 0.03141024780565319,\n\
\ \"acc_norm\": 0.2538860103626943,\n \"acc_norm_stderr\": 0.03141024780565319\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.258974358974359,\n \"acc_stderr\": 0.022211106810061658,\n \
\ \"acc_norm\": 0.258974358974359,\n \"acc_norm_stderr\": 0.022211106810061658\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.026842057873833706,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.026842057873833706\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.030388353551886838,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.030388353551886838\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.25165562913907286,\n \"acc_stderr\": 0.035433042343899844,\n \"\
acc_norm\": 0.25165562913907286,\n \"acc_norm_stderr\": 0.035433042343899844\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.26055045871559634,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.26055045871559634,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.17592592592592593,\n \"acc_stderr\": 0.025967420958258533,\n \"\
acc_norm\": 0.17592592592592593,\n \"acc_norm_stderr\": 0.025967420958258533\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22549019607843138,\n \"acc_stderr\": 0.029331162294251745,\n \"\
acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.029331162294251745\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.29957805907172996,\n \"acc_stderr\": 0.029818024749753095,\n \
\ \"acc_norm\": 0.29957805907172996,\n \"acc_norm_stderr\": 0.029818024749753095\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.4125560538116592,\n\
\ \"acc_stderr\": 0.03304062175449296,\n \"acc_norm\": 0.4125560538116592,\n\
\ \"acc_norm_stderr\": 0.03304062175449296\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.39669421487603307,\n \"acc_stderr\": 0.044658697805310094,\n \"\
acc_norm\": 0.39669421487603307,\n \"acc_norm_stderr\": 0.044658697805310094\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.3148148148148148,\n\
\ \"acc_stderr\": 0.04489931073591312,\n \"acc_norm\": 0.3148148148148148,\n\
\ \"acc_norm_stderr\": 0.04489931073591312\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.25766871165644173,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.25766871165644173,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3392857142857143,\n\
\ \"acc_stderr\": 0.04493949068613539,\n \"acc_norm\": 0.3392857142857143,\n\
\ \"acc_norm_stderr\": 0.04493949068613539\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3106796116504854,\n \"acc_stderr\": 0.04582124160161551,\n\
\ \"acc_norm\": 0.3106796116504854,\n \"acc_norm_stderr\": 0.04582124160161551\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.4700854700854701,\n\
\ \"acc_stderr\": 0.03269741106812443,\n \"acc_norm\": 0.4700854700854701,\n\
\ \"acc_norm_stderr\": 0.03269741106812443\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.3231162196679438,\n\
\ \"acc_stderr\": 0.016723726512343048,\n \"acc_norm\": 0.3231162196679438,\n\
\ \"acc_norm_stderr\": 0.016723726512343048\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2774566473988439,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.2774566473988439,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925302,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925302\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2875816993464052,\n \"acc_stderr\": 0.02591780611714716,\n\
\ \"acc_norm\": 0.2875816993464052,\n \"acc_norm_stderr\": 0.02591780611714716\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3215434083601286,\n\
\ \"acc_stderr\": 0.026527724079528872,\n \"acc_norm\": 0.3215434083601286,\n\
\ \"acc_norm_stderr\": 0.026527724079528872\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.29012345679012347,\n \"acc_stderr\": 0.025251173936495026,\n\
\ \"acc_norm\": 0.29012345679012347,\n \"acc_norm_stderr\": 0.025251173936495026\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24822695035460993,\n \"acc_stderr\": 0.02577001564429039,\n \
\ \"acc_norm\": 0.24822695035460993,\n \"acc_norm_stderr\": 0.02577001564429039\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.258148631029987,\n\
\ \"acc_stderr\": 0.011176923719313397,\n \"acc_norm\": 0.258148631029987,\n\
\ \"acc_norm_stderr\": 0.011176923719313397\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2610294117647059,\n \"acc_stderr\": 0.02667925227010312,\n\
\ \"acc_norm\": 0.2610294117647059,\n \"acc_norm_stderr\": 0.02667925227010312\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.28431372549019607,\n \"acc_stderr\": 0.01824902441120767,\n \
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.01824902441120767\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.35454545454545455,\n\
\ \"acc_stderr\": 0.04582004841505416,\n \"acc_norm\": 0.35454545454545455,\n\
\ \"acc_norm_stderr\": 0.04582004841505416\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.027682979522960224,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.027682979522960224\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.32338308457711445,\n\
\ \"acc_stderr\": 0.03307615947979034,\n \"acc_norm\": 0.32338308457711445,\n\
\ \"acc_norm_stderr\": 0.03307615947979034\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3313253012048193,\n\
\ \"acc_stderr\": 0.03664314777288087,\n \"acc_norm\": 0.3313253012048193,\n\
\ \"acc_norm_stderr\": 0.03664314777288087\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.36257309941520466,\n \"acc_stderr\": 0.0368713061556206,\n\
\ \"acc_norm\": 0.36257309941520466,\n \"acc_norm_stderr\": 0.0368713061556206\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23378212974296206,\n\
\ \"mc1_stderr\": 0.014816195991931583,\n \"mc2\": 0.3617412829286974,\n\
\ \"mc2_stderr\": 0.014985922251134924\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6464088397790055,\n \"acc_stderr\": 0.013436541262599948\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0075815011372251705,\n \
\ \"acc_stderr\": 0.0023892815120772084\n }\n}\n```"
repo_url: https://huggingface.co/InnerI/I-Code-NousLlama7B-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-11-46.189486.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-11-46.189486.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- '**/details_harness|winogrande|5_2024-03-10T01-11-46.189486.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T01-11-46.189486.parquet'
- config_name: results
data_files:
- split: 2024_03_10T01_11_46.189486
path:
- results_2024-03-10T01-11-46.189486.parquet
- split: latest
path:
- results_2024-03-10T01-11-46.189486.parquet
---
# Dataset Card for Evaluation run of InnerI/I-Code-NousLlama7B-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [InnerI/I-Code-NousLlama7B-slerp](https://huggingface.co/InnerI/I-Code-NousLlama7B-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
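Since each run's split is named after its timestamp, the most recent run can be identified by parsing those names. The sketch below is a hypothetical helper (`newest_split` is not part of this dataset's tooling) that picks the newest timestamped split; the `latest` split alias points to the same data.

```python
from datetime import datetime

def newest_split(split_names):
    """Return the most recent timestamped split name.

    Split names follow the pattern used in this card, e.g.
    "2024_03_10T01_11_46.189486"; the "latest" alias is skipped.
    """
    stamped = [s for s in split_names if s != "latest"]
    return max(stamped, key=lambda s: datetime.strptime(s, "%Y_%m_%dT%H_%M_%S.%f"))

# Example with two hypothetical run timestamps:
print(newest_split(["2024_03_09T19_58_29.731575",
                    "2024_03_10T01_11_46.189486",
                    "latest"]))
# → 2024_03_10T01_11_46.189486
```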
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_InnerI__I-Code-NousLlama7B-slerp",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T01:11:46.189486](https://huggingface.co/datasets/open-llm-leaderboard/details_InnerI__I-Code-NousLlama7B-slerp/blob/main/results_2024-03-10T01-11-46.189486.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2897605521045545,
"acc_stderr": 0.03188468910540453,
"acc_norm": 0.29130314890817316,
"acc_norm_stderr": 0.03269780079088881,
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.3617412829286974,
"mc2_stderr": 0.014985922251134924
},
"harness|arc:challenge|25": {
"acc": 0.38139931740614336,
"acc_stderr": 0.014194389086685261,
"acc_norm": 0.4035836177474403,
"acc_norm_stderr": 0.014337158914268447
},
"harness|hellaswag|10": {
"acc": 0.46723760207130055,
"acc_stderr": 0.0049790580784787,
"acc_norm": 0.6105357498506274,
"acc_norm_stderr": 0.004866322258335979
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.23703703703703705,
"acc_stderr": 0.03673731683969506,
"acc_norm": 0.23703703703703705,
"acc_norm_stderr": 0.03673731683969506
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.21710526315789475,
"acc_stderr": 0.03355045304882922,
"acc_norm": 0.21710526315789475,
"acc_norm_stderr": 0.03355045304882922
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.28679245283018867,
"acc_stderr": 0.027834912527544064,
"acc_norm": 0.28679245283018867,
"acc_norm_stderr": 0.027834912527544064
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.17,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.17,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2254335260115607,
"acc_stderr": 0.03186209851641145,
"acc_norm": 0.2254335260115607,
"acc_norm_stderr": 0.03186209851641145
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.19607843137254902,
"acc_stderr": 0.03950581861179962,
"acc_norm": 0.19607843137254902,
"acc_norm_stderr": 0.03950581861179962
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.3021276595744681,
"acc_stderr": 0.030017554471880554,
"acc_norm": 0.3021276595744681,
"acc_norm_stderr": 0.030017554471880554
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2894736842105263,
"acc_stderr": 0.04266339443159394,
"acc_norm": 0.2894736842105263,
"acc_norm_stderr": 0.04266339443159394
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2689655172413793,
"acc_stderr": 0.03695183311650232,
"acc_norm": 0.2689655172413793,
"acc_norm_stderr": 0.03695183311650232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.02141168439369418,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.02141168439369418
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047182,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047182
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.023904914311782658,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.023904914311782658
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.32323232323232326,
"acc_stderr": 0.03332299921070645,
"acc_norm": 0.32323232323232326,
"acc_norm_stderr": 0.03332299921070645
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.2538860103626943,
"acc_stderr": 0.03141024780565319,
"acc_norm": 0.2538860103626943,
"acc_norm_stderr": 0.03141024780565319
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.258974358974359,
"acc_stderr": 0.022211106810061658,
"acc_norm": 0.258974358974359,
"acc_norm_stderr": 0.022211106810061658
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.026842057873833706,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.026842057873833706
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.030388353551886838,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.030388353551886838
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.25165562913907286,
"acc_stderr": 0.035433042343899844,
"acc_norm": 0.25165562913907286,
"acc_norm_stderr": 0.035433042343899844
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.26055045871559634,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.26055045871559634,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.17592592592592593,
"acc_stderr": 0.025967420958258533,
"acc_norm": 0.17592592592592593,
"acc_norm_stderr": 0.025967420958258533
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.029331162294251745,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.029331162294251745
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.29957805907172996,
"acc_stderr": 0.029818024749753095,
"acc_norm": 0.29957805907172996,
"acc_norm_stderr": 0.029818024749753095
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.4125560538116592,
"acc_stderr": 0.03304062175449296,
"acc_norm": 0.4125560538116592,
"acc_norm_stderr": 0.03304062175449296
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.39669421487603307,
"acc_stderr": 0.044658697805310094,
"acc_norm": 0.39669421487603307,
"acc_norm_stderr": 0.044658697805310094
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.3148148148148148,
"acc_stderr": 0.04489931073591312,
"acc_norm": 0.3148148148148148,
"acc_norm_stderr": 0.04489931073591312
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.25766871165644173,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.25766871165644173,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3392857142857143,
"acc_stderr": 0.04493949068613539,
"acc_norm": 0.3392857142857143,
"acc_norm_stderr": 0.04493949068613539
},
"harness|hendrycksTest-management|5": {
"acc": 0.3106796116504854,
"acc_stderr": 0.04582124160161551,
"acc_norm": 0.3106796116504854,
"acc_norm_stderr": 0.04582124160161551
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.4700854700854701,
"acc_stderr": 0.03269741106812443,
"acc_norm": 0.4700854700854701,
"acc_norm_stderr": 0.03269741106812443
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.3231162196679438,
"acc_stderr": 0.016723726512343048,
"acc_norm": 0.3231162196679438,
"acc_norm_stderr": 0.016723726512343048
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2774566473988439,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.2774566473988439,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925302,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925302
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2875816993464052,
"acc_stderr": 0.02591780611714716,
"acc_norm": 0.2875816993464052,
"acc_norm_stderr": 0.02591780611714716
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3215434083601286,
"acc_stderr": 0.026527724079528872,
"acc_norm": 0.3215434083601286,
"acc_norm_stderr": 0.026527724079528872
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.29012345679012347,
"acc_stderr": 0.025251173936495026,
"acc_norm": 0.29012345679012347,
"acc_norm_stderr": 0.025251173936495026
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24822695035460993,
"acc_stderr": 0.02577001564429039,
"acc_norm": 0.24822695035460993,
"acc_norm_stderr": 0.02577001564429039
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.258148631029987,
"acc_stderr": 0.011176923719313397,
"acc_norm": 0.258148631029987,
"acc_norm_stderr": 0.011176923719313397
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2610294117647059,
"acc_stderr": 0.02667925227010312,
"acc_norm": 0.2610294117647059,
"acc_norm_stderr": 0.02667925227010312
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.01824902441120767,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.01824902441120767
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.35454545454545455,
"acc_stderr": 0.04582004841505416,
"acc_norm": 0.35454545454545455,
"acc_norm_stderr": 0.04582004841505416
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.027682979522960224,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.027682979522960224
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.32338308457711445,
"acc_stderr": 0.03307615947979034,
"acc_norm": 0.32338308457711445,
"acc_norm_stderr": 0.03307615947979034
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3313253012048193,
"acc_stderr": 0.03664314777288087,
"acc_norm": 0.3313253012048193,
"acc_norm_stderr": 0.03664314777288087
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.36257309941520466,
"acc_stderr": 0.0368713061556206,
"acc_norm": 0.36257309941520466,
"acc_norm_stderr": 0.0368713061556206
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23378212974296206,
"mc1_stderr": 0.014816195991931583,
"mc2": 0.3617412829286974,
"mc2_stderr": 0.014985922251134924
},
"harness|winogrande|5": {
"acc": 0.6464088397790055,
"acc_stderr": 0.013436541262599948
},
"harness|gsm8k|5": {
"acc": 0.0075815011372251705,
"acc_stderr": 0.0023892815120772084
}
}
```
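The per-task scores in the JSON above can be post-processed directly once loaded. As a minimal sketch (using a small hand-copied subset of the reported scores, not the full results dict), this ranks the `hendrycksTest` subtasks by accuracy:

```python
# Rank MMLU (hendrycksTest) subtasks by accuracy, given a results dict
# shaped like the JSON above. The sample below copies a few of the
# reported scores; a full dict would contain every task.
results = {
    "harness|hendrycksTest-marketing|5": {"acc": 0.4700854700854701},
    "harness|hendrycksTest-human_aging|5": {"acc": 0.4125560538116592},
    "harness|hendrycksTest-high_school_statistics|5": {"acc": 0.17592592592592593},
    "harness|winogrande|5": {"acc": 0.6464088397790055},  # not an MMLU task
}

# Keep only hendrycksTest entries and strip the "harness|...|5" wrapper.
mmlu = {
    name.split("-", 1)[1].split("|")[0]: scores["acc"]
    for name, scores in results.items()
    if name.startswith("harness|hendrycksTest-")
}
ranked = sorted(mmlu.items(), key=lambda kv: kv[1], reverse=True)
for task, acc in ranked:
    print(f"{task}: {acc:.3f}")
```

The same pattern applies to any per-task metric (`acc_norm`, `mc2`, ...) in the results files.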
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_nbeerbower__Flammen-Trismegistus-7B | open-llm-leaderboard-old | "2024-03-10T01:19:14Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T01:18:53Z" | ---
pretty_name: Evaluation run of nbeerbower/Flammen-Trismegistus-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [nbeerbower/Flammen-Trismegistus-7B](https://huggingface.co/nbeerbower/Flammen-Trismegistus-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_nbeerbower__Flammen-Trismegistus-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T01:16:39.494552](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__Flammen-Trismegistus-7B/blob/main/results_2024-03-10T01-16-39.494552.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6253524458134281,\n\
\ \"acc_stderr\": 0.03254095334023857,\n \"acc_norm\": 0.6285846132548462,\n\
\ \"acc_norm_stderr\": 0.033185314751337555,\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.017225627083660867,\n \"mc2\": 0.5712348607371314,\n\
\ \"mc2_stderr\": 0.016116004585264337\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6049488054607508,\n \"acc_stderr\": 0.014285898292938167,\n\
\ \"acc_norm\": 0.6399317406143344,\n \"acc_norm_stderr\": 0.01402751681458519\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.667894841665007,\n\
\ \"acc_stderr\": 0.0047000596713746385,\n \"acc_norm\": 0.8479386576379208,\n\
\ \"acc_norm_stderr\": 0.0035834648107534585\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6578947368421053,\n \"acc_stderr\": 0.03860731599316091,\n\
\ \"acc_norm\": 0.6578947368421053,\n \"acc_norm_stderr\": 0.03860731599316091\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.57,\n\
\ \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.57,\n \
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933714,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933714\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7083333333333334,\n\
\ \"acc_stderr\": 0.038009680605548594,\n \"acc_norm\": 0.7083333333333334,\n\
\ \"acc_norm_stderr\": 0.038009680605548594\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5574468085106383,\n \"acc_stderr\": 0.03246956919789958,\n\
\ \"acc_norm\": 0.5574468085106383,\n \"acc_norm_stderr\": 0.03246956919789958\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3888888888888889,\n\
\ \"acc_stderr\": 0.04360314860077459,\n \"acc_norm\": 0.3888888888888889,\n\
\ \"acc_norm_stderr\": 0.04360314860077459\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7677419354838709,\n \"acc_stderr\": 0.024022256130308235,\n \"\
acc_norm\": 0.7677419354838709,\n \"acc_norm_stderr\": 0.024022256130308235\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4876847290640394,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.4876847290640394,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009182,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009182\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586808,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586808\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8601036269430051,\n \"acc_stderr\": 0.025033870583015184,\n\
\ \"acc_norm\": 0.8601036269430051,\n \"acc_norm_stderr\": 0.025033870583015184\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6153846153846154,\n \"acc_stderr\": 0.024666744915187205,\n\
\ \"acc_norm\": 0.6153846153846154,\n \"acc_norm_stderr\": 0.024666744915187205\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32592592592592595,\n \"acc_stderr\": 0.02857834836547308,\n \
\ \"acc_norm\": 0.32592592592592595,\n \"acc_norm_stderr\": 0.02857834836547308\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135356,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135356\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8036697247706422,\n\
\ \"acc_stderr\": 0.01703071933915434,\n \"acc_norm\": 0.8036697247706422,\n\
\ \"acc_norm_stderr\": 0.01703071933915434\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.47685185185185186,\n \"acc_stderr\": 0.03406315360711507,\n\
\ \"acc_norm\": 0.47685185185185186,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967407,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967407\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7932489451476793,\n \"acc_stderr\": 0.026361651668389094,\n \
\ \"acc_norm\": 0.7932489451476793,\n \"acc_norm_stderr\": 0.026361651668389094\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596914,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596914\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8058748403575989,\n\
\ \"acc_stderr\": 0.014143970276657567,\n \"acc_norm\": 0.8058748403575989,\n\
\ \"acc_norm_stderr\": 0.014143970276657567\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7109826589595376,\n \"acc_stderr\": 0.02440517393578323,\n\
\ \"acc_norm\": 0.7109826589595376,\n \"acc_norm_stderr\": 0.02440517393578323\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.35977653631284917,\n\
\ \"acc_stderr\": 0.016051419760310263,\n \"acc_norm\": 0.35977653631284917,\n\
\ \"acc_norm_stderr\": 0.016051419760310263\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6928104575163399,\n \"acc_stderr\": 0.026415601914388992,\n\
\ \"acc_norm\": 0.6928104575163399,\n \"acc_norm_stderr\": 0.026415601914388992\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6752411575562701,\n\
\ \"acc_stderr\": 0.026596782287697043,\n \"acc_norm\": 0.6752411575562701,\n\
\ \"acc_norm_stderr\": 0.026596782287697043\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6790123456790124,\n \"acc_stderr\": 0.02597656601086274,\n\
\ \"acc_norm\": 0.6790123456790124,\n \"acc_norm_stderr\": 0.02597656601086274\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46099290780141844,\n \"acc_stderr\": 0.02973659252642444,\n \
\ \"acc_norm\": 0.46099290780141844,\n \"acc_norm_stderr\": 0.02973659252642444\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45436766623207303,\n\
\ \"acc_stderr\": 0.012716941720734804,\n \"acc_norm\": 0.45436766623207303,\n\
\ \"acc_norm_stderr\": 0.012716941720734804\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.0290294228156814,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.0290294228156814\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6454248366013072,\n \"acc_stderr\": 0.0193533605475537,\n \
\ \"acc_norm\": 0.6454248366013072,\n \"acc_norm_stderr\": 0.0193533605475537\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6979591836734694,\n \"acc_stderr\": 0.029393609319879804,\n\
\ \"acc_norm\": 0.6979591836734694,\n \"acc_norm_stderr\": 0.029393609319879804\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8208955223880597,\n\
\ \"acc_stderr\": 0.027113286753111837,\n \"acc_norm\": 0.8208955223880597,\n\
\ \"acc_norm_stderr\": 0.027113286753111837\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.82,\n \"acc_stderr\": 0.03861229196653694,\n \
\ \"acc_norm\": 0.82,\n \"acc_norm_stderr\": 0.03861229196653694\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4112607099143207,\n\
\ \"mc1_stderr\": 0.017225627083660867,\n \"mc2\": 0.5712348607371314,\n\
\ \"mc2_stderr\": 0.016116004585264337\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650872\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.510235026535254,\n \
\ \"acc_stderr\": 0.013769598923012384\n }\n}\n```"
repo_url: https://huggingface.co/nbeerbower/Flammen-Trismegistus-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-16-39.494552.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-16-39.494552.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- '**/details_harness|winogrande|5_2024-03-10T01-16-39.494552.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T01-16-39.494552.parquet'
- config_name: results
data_files:
- split: 2024_03_10T01_16_39.494552
path:
- results_2024-03-10T01-16-39.494552.parquet
- split: latest
path:
- results_2024-03-10T01-16-39.494552.parquet
---
# Dataset Card for Evaluation run of nbeerbower/Flammen-Trismegistus-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [nbeerbower/Flammen-Trismegistus-7B](https://huggingface.co/nbeerbower/Flammen-Trismegistus-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_nbeerbower__Flammen-Trismegistus-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T01:16:39.494552](https://huggingface.co/datasets/open-llm-leaderboard/details_nbeerbower__Flammen-Trismegistus-7B/blob/main/results_2024-03-10T01-16-39.494552.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6253524458134281,
"acc_stderr": 0.03254095334023857,
"acc_norm": 0.6285846132548462,
"acc_norm_stderr": 0.033185314751337555,
"mc1": 0.4112607099143207,
"mc1_stderr": 0.017225627083660867,
"mc2": 0.5712348607371314,
"mc2_stderr": 0.016116004585264337
},
"harness|arc:challenge|25": {
"acc": 0.6049488054607508,
"acc_stderr": 0.014285898292938167,
"acc_norm": 0.6399317406143344,
"acc_norm_stderr": 0.01402751681458519
},
"harness|hellaswag|10": {
"acc": 0.667894841665007,
"acc_stderr": 0.0047000596713746385,
"acc_norm": 0.8479386576379208,
"acc_norm_stderr": 0.0035834648107534585
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6578947368421053,
"acc_stderr": 0.03860731599316091,
"acc_norm": 0.6578947368421053,
"acc_norm_stderr": 0.03860731599316091
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.57,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.57,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933714,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933714
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7083333333333334,
"acc_stderr": 0.038009680605548594,
"acc_norm": 0.7083333333333334,
"acc_norm_stderr": 0.038009680605548594
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5574468085106383,
"acc_stderr": 0.03246956919789958,
"acc_norm": 0.5574468085106383,
"acc_norm_stderr": 0.03246956919789958
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3888888888888889,
"acc_stderr": 0.04360314860077459,
"acc_norm": 0.3888888888888889,
"acc_norm_stderr": 0.04360314860077459
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7677419354838709,
"acc_stderr": 0.024022256130308235,
"acc_norm": 0.7677419354838709,
"acc_norm_stderr": 0.024022256130308235
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4876847290640394,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.4876847290640394,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009182,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009182
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586808,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586808
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8601036269430051,
"acc_stderr": 0.025033870583015184,
"acc_norm": 0.8601036269430051,
"acc_norm_stderr": 0.025033870583015184
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6153846153846154,
"acc_stderr": 0.024666744915187205,
"acc_norm": 0.6153846153846154,
"acc_norm_stderr": 0.024666744915187205
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32592592592592595,
"acc_stderr": 0.02857834836547308,
"acc_norm": 0.32592592592592595,
"acc_norm_stderr": 0.02857834836547308
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135356,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135356
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8036697247706422,
"acc_stderr": 0.01703071933915434,
"acc_norm": 0.8036697247706422,
"acc_norm_stderr": 0.01703071933915434
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.47685185185185186,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.47685185185185186,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967407,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967407
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7932489451476793,
"acc_stderr": 0.026361651668389094,
"acc_norm": 0.7932489451476793,
"acc_norm_stderr": 0.026361651668389094
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596914,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596914
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.72,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8058748403575989,
"acc_stderr": 0.014143970276657567,
"acc_norm": 0.8058748403575989,
"acc_norm_stderr": 0.014143970276657567
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7109826589595376,
"acc_stderr": 0.02440517393578323,
"acc_norm": 0.7109826589595376,
"acc_norm_stderr": 0.02440517393578323
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.35977653631284917,
"acc_stderr": 0.016051419760310263,
"acc_norm": 0.35977653631284917,
"acc_norm_stderr": 0.016051419760310263
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6928104575163399,
"acc_stderr": 0.026415601914388992,
"acc_norm": 0.6928104575163399,
"acc_norm_stderr": 0.026415601914388992
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6752411575562701,
"acc_stderr": 0.026596782287697043,
"acc_norm": 0.6752411575562701,
"acc_norm_stderr": 0.026596782287697043
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6790123456790124,
"acc_stderr": 0.02597656601086274,
"acc_norm": 0.6790123456790124,
"acc_norm_stderr": 0.02597656601086274
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46099290780141844,
"acc_stderr": 0.02973659252642444,
"acc_norm": 0.46099290780141844,
"acc_norm_stderr": 0.02973659252642444
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45436766623207303,
"acc_stderr": 0.012716941720734804,
"acc_norm": 0.45436766623207303,
"acc_norm_stderr": 0.012716941720734804
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.0290294228156814,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.0290294228156814
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6454248366013072,
"acc_stderr": 0.0193533605475537,
"acc_norm": 0.6454248366013072,
"acc_norm_stderr": 0.0193533605475537
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6979591836734694,
"acc_stderr": 0.029393609319879804,
"acc_norm": 0.6979591836734694,
"acc_norm_stderr": 0.029393609319879804
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8208955223880597,
"acc_stderr": 0.027113286753111837,
"acc_norm": 0.8208955223880597,
"acc_norm_stderr": 0.027113286753111837
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.82,
"acc_stderr": 0.03861229196653694,
"acc_norm": 0.82,
"acc_norm_stderr": 0.03861229196653694
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4112607099143207,
"mc1_stderr": 0.017225627083660867,
"mc2": 0.5712348607371314,
"mc2_stderr": 0.016116004585264337
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650872
},
"harness|gsm8k|5": {
"acc": 0.510235026535254,
"acc_stderr": 0.013769598923012384
}
}
```
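For quick inspection, the aggregated results JSON above can be parsed directly with the standard library. A minimal sketch, using an illustrative subset of the task entries shown (not the full results file):

```python
import json

# Illustrative excerpt of the aggregated results JSON shown above.
results_json = """
{
  "harness|winogrande|5": {"acc": 0.7647987371744278, "acc_stderr": 0.011920008163650872},
  "harness|gsm8k|5": {"acc": 0.510235026535254, "acc_stderr": 0.013769598923012384}
}
"""

results = json.loads(results_json)

# Task results are keyed as "harness|<task>|<num_fewshot>"; pull one metric out.
gsm8k_acc = results["harness|gsm8k|5"]["acc"]
print(f"GSM8K 5-shot accuracy: {gsm8k_acc:.4f}")
```

The same traversal applies to any of the per-task blocks above, since each one maps metric names (`acc`, `acc_stderr`, `acc_norm`, ...) to floats.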
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

open-llm-leaderboard-old/details_NExtNewChattingAI__Mutliverse_model_official | open-llm-leaderboard-old | "2024-03-10T01:28:17Z" | 0 | 0 | ["region:us"] | null | "2024-03-10T01:27:56Z"

---
pretty_name: Evaluation run of NExtNewChattingAI/Mutliverse_model_official
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [NExtNewChattingAI/Mutliverse_model_official](https://huggingface.co/NExtNewChattingAI/Mutliverse_model_official)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_NExtNewChattingAI__Mutliverse_model_official\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T01:25:40.754739](https://huggingface.co/datasets/open-llm-leaderboard/details_NExtNewChattingAI__Mutliverse_model_official/blob/main/results_2024-03-10T01-25-40.754739.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6508050815217578,\n\
\ \"acc_stderr\": 0.03206577150040621,\n \"acc_norm\": 0.649812675164199,\n\
\ \"acc_norm_stderr\": 0.03274163164992761,\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7793251604495214,\n\
\ \"mc2_stderr\": 0.01369222047572167\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7090443686006825,\n \"acc_stderr\": 0.01327307786590759,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.715893248356901,\n\
\ \"acc_stderr\": 0.004500662294697923,\n \"acc_norm\": 0.891256721768572,\n\
\ \"acc_norm_stderr\": 0.0031068060075356277\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.02815283794249387,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.02815283794249387\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620333\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.56,\n \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n\
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.04858083574266345,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.04858083574266345\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642518,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642518\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.03958027231121569,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.03958027231121569\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8431372549019608,\n \"acc_stderr\": 0.02552472232455335,\n \"\
acc_norm\": 0.8431372549019608,\n \"acc_norm_stderr\": 0.02552472232455335\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6816143497757847,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.6816143497757847,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7852760736196319,\n \"acc_stderr\": 0.032262193772867744,\n\
\ \"acc_norm\": 0.7852760736196319,\n \"acc_norm_stderr\": 0.032262193772867744\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368983,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368983\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4245810055865922,\n\
\ \"acc_stderr\": 0.016531170993278888,\n \"acc_norm\": 0.4245810055865922,\n\
\ \"acc_norm_stderr\": 0.016531170993278888\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.02608270069539966,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.02608270069539966\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4758800521512386,\n\
\ \"acc_stderr\": 0.012755368722863937,\n \"acc_norm\": 0.4758800521512386,\n\
\ \"acc_norm_stderr\": 0.012755368722863937\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.684640522875817,\n \"acc_stderr\": 0.01879808628488689,\n \
\ \"acc_norm\": 0.684640522875817,\n \"acc_norm_stderr\": 0.01879808628488689\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6303549571603427,\n\
\ \"mc1_stderr\": 0.01689818070697388,\n \"mc2\": 0.7793251604495214,\n\
\ \"mc2_stderr\": 0.01369222047572167\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.850828729281768,\n \"acc_stderr\": 0.010012598805627295\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7043214556482184,\n \
\ \"acc_stderr\": 0.012570068947898779\n }\n}\n```"
repo_url: https://huggingface.co/NExtNewChattingAI/Mutliverse_model_official
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-25-40.754739.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T01-25-40.754739.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- '**/details_harness|winogrande|5_2024-03-10T01-25-40.754739.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T01-25-40.754739.parquet'
- config_name: results
data_files:
- split: 2024_03_10T01_25_40.754739
path:
- results_2024-03-10T01-25-40.754739.parquet
- split: latest
path:
- results_2024-03-10T01-25-40.754739.parquet
---
# Dataset Card for Evaluation run of NExtNewChattingAI/Mutliverse_model_official
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [NExtNewChattingAI/Mutliverse_model_official](https://huggingface.co/NExtNewChattingAI/Mutliverse_model_official) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_NExtNewChattingAI__Mutliverse_model_official",
"harness_winogrande_5",
	split="latest")
```
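The per-task configurations listed in the YAML header above follow a regular naming scheme (`harness_<suite>_<task>_<num_fewshot>` for the MMLU subtasks), so the snippet above generalizes to any of them by swapping the config name. A small illustrative sketch (the helper function below is ours, not part of the `datasets` API):

```python
def task_config_name(suite: str, task: str, num_fewshot: int) -> str:
    """Build the config name used by this card for an MMLU subtask, e.g.
    task_config_name("hendrycksTest", "world_religions", 5)
    -> "harness_hendrycksTest_world_religions_5".
    Single-task suites such as winogrande omit the task part
    (e.g. "harness_winogrande_5") and are not covered by this helper.
    """
    return f"harness_{suite}_{task}_{num_fewshot}"

# The resulting name is the second argument to datasets.load_dataset(...);
# pass split="latest" to get the most recent evaluation run.
print(task_config_name("hendrycksTest", "world_religions", 5))
```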
## Latest results
These are the [latest results from run 2024-03-10T01:25:40.754739](https://huggingface.co/datasets/open-llm-leaderboard/details_NExtNewChattingAI__Mutliverse_model_official/blob/main/results_2024-03-10T01-25-40.754739.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each task's results in the "results" configuration and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6508050815217578,
"acc_stderr": 0.03206577150040621,
"acc_norm": 0.649812675164199,
"acc_norm_stderr": 0.03274163164992761,
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7793251604495214,
"mc2_stderr": 0.01369222047572167
},
"harness|arc:challenge|25": {
"acc": 0.7090443686006825,
"acc_stderr": 0.01327307786590759,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.715893248356901,
"acc_stderr": 0.004500662294697923,
"acc_norm": 0.891256721768572,
"acc_norm_stderr": 0.0031068060075356277
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.02815283794249387,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.02815283794249387
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.03958027231121569,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.03958027231121569
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8431372549019608,
"acc_stderr": 0.02552472232455335,
"acc_norm": 0.8431372549019608,
"acc_norm_stderr": 0.02552472232455335
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6816143497757847,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.6816143497757847,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7852760736196319,
"acc_stderr": 0.032262193772867744,
"acc_norm": 0.7852760736196319,
"acc_norm_stderr": 0.032262193772867744
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368983,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368983
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4245810055865922,
"acc_stderr": 0.016531170993278888,
"acc_norm": 0.4245810055865922,
"acc_norm_stderr": 0.016531170993278888
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.02608270069539966,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.02608270069539966
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4758800521512386,
"acc_stderr": 0.012755368722863937,
"acc_norm": 0.4758800521512386,
"acc_norm_stderr": 0.012755368722863937
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.684640522875817,
"acc_stderr": 0.01879808628488689,
"acc_norm": 0.684640522875817,
"acc_norm_stderr": 0.01879808628488689
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6303549571603427,
"mc1_stderr": 0.01689818070697388,
"mc2": 0.7793251604495214,
"mc2_stderr": 0.01369222047572167
},
"harness|winogrande|5": {
"acc": 0.850828729281768,
"acc_stderr": 0.010012598805627295
},
"harness|gsm8k|5": {
"acc": 0.7043214556482184,
"acc_stderr": 0.012570068947898779
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
SGBTalha/FelipeNetoSemPod | SGBTalha | "2024-03-10T01:39:26Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T01:38:56Z" | ---
license: openrail
---
|
WarpWingHF/QTRAGPT | WarpWingHF | "2024-03-10T01:56:48Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-03-10T01:56:48Z" | ---
license: mit
---
|
pythonist/staf_alpa_kkm | pythonist | "2024-03-10T02:10:10Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T02:09:45Z" | ---
license: mit
---
|
qwer0213/CWC_dataset | qwer0213 | "2024-03-10T02:23:00Z" | 0 | 0 | [
"license:cc-by-4.0",
"region:us"
] | null | "2024-03-10T02:16:06Z" | ---
license: cc-by-4.0
---
|
Pankajric22/test | Pankajric22 | "2024-03-10T02:26:01Z" | 0 | 0 | [
"task_categories:image-classification",
"annotations_creators:crowdsourced",
"language_creators:crowdsourced",
"source_datasets:extended",
"language:en",
"license:apache-2.0",
"size_categories:1K<n<10K",
"region:us"
] | [
"image-classification"
] | "2024-03-10T02:23:55Z" | ---
annotations_creators:
- crowdsourced
language_creators:
- crowdsourced
language:
- en
license:
- apache-2.0
multilinguality: []
size_categories:
- 1K<n<10K
source_datasets:
- extended
task_categories:
- image-classification
task_ids: []
paperswithcode_id: imagenette
pretty_name: Imagenette
---
# Dataset Card for Imagenette
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** https://github.com/fastai/imagenette
- **Repository:** https://github.com/fastai/imagenette
- **Leaderboard:** https://paperswithcode.com/sota/image-classification-on-imagenette
### Dataset Summary
A smaller subset of 10 easily classified classes from [Imagenet](https://huggingface.co/datasets/imagenet-1k#dataset-summary), and a little more French.
This dataset was created by [Jeremy Howard](https://twitter.com/jeremyphoward), and this repository is only there to share his work on this platform. The repository owner takes no credit of any kind in the creation, curation or packaging of the dataset.
### Supported Tasks and Leaderboards
- `image-classification`: The dataset can be used to train a model for Image Classification.
### Languages
The class labels in the dataset are in English.
## Dataset Structure
### Data Instances
A data point comprises an image and its classification label.
```
{
'image': <PIL.JpegImagePlugin.JpegImageFile image mode=RGB size=320x320 at 0x19FA12186D8>,
'label': 'tench',
}
```
### Data Fields
- `image`: A `PIL.Image.Image` object containing the image.
- `label`: the expected class label of the image.
### Data Splits
| |train|validation|
|----------|----:|---------:|
|imagenette| 9469| 3925|
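As a sanity check on the table above, the split sizes correspond to roughly a 70/30 train/validation partition:

```python
# Split sizes taken from the table above.
train, val = 9469, 3925
total = train + val

print(total)                    # 13394 images overall
print(round(train / total, 2))  # ~0.71 of the data is in the train split
```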
## Dataset Creation
### Curation Rationale
cf. https://huggingface.co/datasets/imagenet-1k#curation-rationale
### Source Data
#### Initial Data Collection and Normalization
Imagenette is a subset of [ImageNet](https://huggingface.co/datasets/imagenet-1k). Information about data collection of the source data can be found [here](https://huggingface.co/datasets/imagenet-1k#initial-data-collection-and-normalization).
### Annotations
#### Annotation process
cf. https://huggingface.co/datasets/imagenet-1k#annotation-process
#### Who are the annotators?
cf. https://huggingface.co/datasets/imagenet-1k#who-are-the-annotators
### Personal and Sensitive Information
cf. https://huggingface.co/datasets/imagenet-1k#personal-and-sensitive-information
## Considerations for Using the Data
### Social Impact of Dataset
cf. https://huggingface.co/datasets/imagenet-1k#social-impact-of-dataset
### Discussion of Biases
cf. https://huggingface.co/datasets/imagenet-1k#discussion-of-biases
### Other Known Limitations
cf. https://huggingface.co/datasets/imagenet-1k#other-known-limitations
## Additional Information
### Dataset Curators
cf. https://huggingface.co/datasets/imagenet-1k#dataset-curators
and Jeremy Howard
### Licensing Information
[Apache License 2.0](https://www.apache.org/licenses/LICENSE-2.0).
### Citation Information
```
@software{Howard_Imagenette_2019,
title={Imagenette: A smaller subset of 10 easily classified classes from Imagenet},
author={Jeremy Howard},
year={2019},
month={March},
publisher = {GitHub},
url = {https://github.com/fastai/imagenette}
}
```
### Contributions
This dataset was created by [Jeremy Howard](https://twitter.com/jeremyphoward) and published on [Github](https://github.com/fastai/imagenette). It was then only integrated into HuggingFace Datasets by [@frgfm](https://huggingface.co/frgfm).
|
open-llm-leaderboard-old/details_ResplendentAI__Paradigm_7B | open-llm-leaderboard-old | "2024-03-10T02:25:22Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T02:25:01Z" | ---
pretty_name: Evaluation run of ResplendentAI/Paradigm_7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [ResplendentAI/Paradigm_7B](https://huggingface.co/ResplendentAI/Paradigm_7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_ResplendentAI__Paradigm_7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T02:22:46.757217](https://huggingface.co/datasets/open-llm-leaderboard/details_ResplendentAI__Paradigm_7B/blob/main/results_2024-03-10T02-22-46.757217.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6465674374221382,\n\
\ \"acc_stderr\": 0.03224243076571729,\n \"acc_norm\": 0.6459602299418657,\n\
\ \"acc_norm_stderr\": 0.03291540888998097,\n \"mc1\": 0.5936352509179926,\n\
\ \"mc1_stderr\": 0.01719383581209389,\n \"mc2\": 0.7519096223392907,\n\
\ \"mc2_stderr\": 0.014233729965790623\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.71160409556314,\n \"acc_stderr\": 0.013238394422428171,\n\
\ \"acc_norm\": 0.7363481228668942,\n \"acc_norm_stderr\": 0.012875929151297044\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.727046405098586,\n\
\ \"acc_stderr\": 0.004445667638734141,\n \"acc_norm\": 0.8865763792073292,\n\
\ \"acc_norm_stderr\": 0.0031646183947831807\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.042320736951515885,\n \"acc_norm\": 0.6,\n \
\ \"acc_norm_stderr\": 0.042320736951515885\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7236842105263158,\n \"acc_stderr\": 0.036390575699529276,\n\
\ \"acc_norm\": 0.7236842105263158,\n \"acc_norm_stderr\": 0.036390575699529276\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7094339622641509,\n \"acc_stderr\": 0.02794321998933712,\n\
\ \"acc_norm\": 0.7094339622641509,\n \"acc_norm_stderr\": 0.02794321998933712\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.54,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6416184971098265,\n\
\ \"acc_stderr\": 0.036563436533531585,\n \"acc_norm\": 0.6416184971098265,\n\
\ \"acc_norm_stderr\": 0.036563436533531585\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107224,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107224\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5617021276595745,\n \"acc_stderr\": 0.03243618636108101,\n\
\ \"acc_norm\": 0.5617021276595745,\n \"acc_norm_stderr\": 0.03243618636108101\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7774193548387097,\n\
\ \"acc_stderr\": 0.023664216671642514,\n \"acc_norm\": 0.7774193548387097,\n\
\ \"acc_norm_stderr\": 0.023664216671642514\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n\
\ \"acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621505,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621505\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6641025641025641,\n \"acc_stderr\": 0.023946724741563976,\n\
\ \"acc_norm\": 0.6641025641025641,\n \"acc_norm_stderr\": 0.023946724741563976\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.030388353551886793,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.030388353551886793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163227,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163227\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.4861111111111111,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.4861111111111111,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8235294117647058,\n \"acc_stderr\": 0.026756401538078966,\n \"\
acc_norm\": 0.8235294117647058,\n \"acc_norm_stderr\": 0.026756401538078966\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290902,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290902\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.031381476375754995,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.031381476375754995\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7938931297709924,\n \"acc_stderr\": 0.03547771004159465,\n\
\ \"acc_norm\": 0.7938931297709924,\n \"acc_norm_stderr\": 0.03547771004159465\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.0401910747255735,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.0401910747255735\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.67,\n \"acc_stderr\": 0.047258156262526094,\n \
\ \"acc_norm\": 0.67,\n \"acc_norm_stderr\": 0.047258156262526094\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8212005108556832,\n\
\ \"acc_stderr\": 0.013702643715368985,\n \"acc_norm\": 0.8212005108556832,\n\
\ \"acc_norm_stderr\": 0.013702643715368985\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7341040462427746,\n \"acc_stderr\": 0.02378620325550829,\n\
\ \"acc_norm\": 0.7341040462427746,\n \"acc_norm_stderr\": 0.02378620325550829\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.44581005586592176,\n\
\ \"acc_stderr\": 0.01662399851333311,\n \"acc_norm\": 0.44581005586592176,\n\
\ \"acc_norm_stderr\": 0.01662399851333311\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7287581699346405,\n \"acc_stderr\": 0.025457756696667878,\n\
\ \"acc_norm\": 0.7287581699346405,\n \"acc_norm_stderr\": 0.025457756696667878\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537365,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537365\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.01274307294265335,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.01274307294265335\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.673202614379085,\n \"acc_stderr\": 0.018975427920507208,\n \
\ \"acc_norm\": 0.673202614379085,\n \"acc_norm_stderr\": 0.018975427920507208\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8258706467661692,\n\
\ \"acc_stderr\": 0.026814951200421603,\n \"acc_norm\": 0.8258706467661692,\n\
\ \"acc_norm_stderr\": 0.026814951200421603\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.5936352509179926,\n\
\ \"mc1_stderr\": 0.01719383581209389,\n \"mc2\": 0.7519096223392907,\n\
\ \"mc2_stderr\": 0.014233729965790623\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8453038674033149,\n \"acc_stderr\": 0.010163172650433537\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6679302501895376,\n \
\ \"acc_stderr\": 0.012972465034361858\n }\n}\n```"
repo_url: https://huggingface.co/ResplendentAI/Paradigm_7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|arc:challenge|25_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|gsm8k|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hellaswag|10_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-22-46.757217.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T02-22-46.757217.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- '**/details_harness|winogrande|5_2024-03-10T02-22-46.757217.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T02-22-46.757217.parquet'
- config_name: results
data_files:
- split: 2024_03_10T02_22_46.757217
path:
- results_2024-03-10T02-22-46.757217.parquet
- split: latest
path:
- results_2024-03-10T02-22-46.757217.parquet
---
# Dataset Card for Evaluation run of ResplendentAI/Paradigm_7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [ResplendentAI/Paradigm_7B](https://huggingface.co/ResplendentAI/Paradigm_7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_ResplendentAI__Paradigm_7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T02:22:46.757217](https://huggingface.co/datasets/open-llm-leaderboard/details_ResplendentAI__Paradigm_7B/blob/main/results_2024-03-10T02-22-46.757217.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its "latest" split):
```python
{
"all": {
"acc": 0.6465674374221382,
"acc_stderr": 0.03224243076571729,
"acc_norm": 0.6459602299418657,
"acc_norm_stderr": 0.03291540888998097,
"mc1": 0.5936352509179926,
"mc1_stderr": 0.01719383581209389,
"mc2": 0.7519096223392907,
"mc2_stderr": 0.014233729965790623
},
"harness|arc:challenge|25": {
"acc": 0.71160409556314,
"acc_stderr": 0.013238394422428171,
"acc_norm": 0.7363481228668942,
"acc_norm_stderr": 0.012875929151297044
},
"harness|hellaswag|10": {
"acc": 0.727046405098586,
"acc_stderr": 0.004445667638734141,
"acc_norm": 0.8865763792073292,
"acc_norm_stderr": 0.0031646183947831807
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.042320736951515885,
"acc_norm": 0.6,
"acc_norm_stderr": 0.042320736951515885
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7236842105263158,
"acc_stderr": 0.036390575699529276,
"acc_norm": 0.7236842105263158,
"acc_norm_stderr": 0.036390575699529276
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7094339622641509,
"acc_stderr": 0.02794321998933712,
"acc_norm": 0.7094339622641509,
"acc_norm_stderr": 0.02794321998933712
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.036563436533531585,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.036563436533531585
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107224,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107224
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5617021276595745,
"acc_stderr": 0.03243618636108101,
"acc_norm": 0.5617021276595745,
"acc_norm_stderr": 0.03243618636108101
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642514,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642514
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621505,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621505
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6641025641025641,
"acc_stderr": 0.023946724741563976,
"acc_norm": 0.6641025641025641,
"acc_norm_stderr": 0.023946724741563976
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.030388353551886793,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.030388353551886793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163227,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163227
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.4861111111111111,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.4861111111111111,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8235294117647058,
"acc_stderr": 0.026756401538078966,
"acc_norm": 0.8235294117647058,
"acc_norm_stderr": 0.026756401538078966
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290902,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290902
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.031381476375754995,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.031381476375754995
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7938931297709924,
"acc_stderr": 0.03547771004159465,
"acc_norm": 0.7938931297709924,
"acc_norm_stderr": 0.03547771004159465
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.0401910747255735,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.0401910747255735
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8212005108556832,
"acc_stderr": 0.013702643715368985,
"acc_norm": 0.8212005108556832,
"acc_norm_stderr": 0.013702643715368985
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7341040462427746,
"acc_stderr": 0.02378620325550829,
"acc_norm": 0.7341040462427746,
"acc_norm_stderr": 0.02378620325550829
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.44581005586592176,
"acc_stderr": 0.01662399851333311,
"acc_norm": 0.44581005586592176,
"acc_norm_stderr": 0.01662399851333311
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7287581699346405,
"acc_stderr": 0.025457756696667878,
"acc_norm": 0.7287581699346405,
"acc_norm_stderr": 0.025457756696667878
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537365,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537365
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.01274307294265335,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.01274307294265335
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.673202614379085,
"acc_stderr": 0.018975427920507208,
"acc_norm": 0.673202614379085,
"acc_norm_stderr": 0.018975427920507208
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8258706467661692,
"acc_stderr": 0.026814951200421603,
"acc_norm": 0.8258706467661692,
"acc_norm_stderr": 0.026814951200421603
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.5936352509179926,
"mc1_stderr": 0.01719383581209389,
"mc2": 0.7519096223392907,
"mc2_stderr": 0.014233729965790623
},
"harness|winogrande|5": {
"acc": 0.8453038674033149,
"acc_stderr": 0.010163172650433537
},
"harness|gsm8k|5": {
"acc": 0.6679302501895376,
"acc_stderr": 0.012972465034361858
}
}
```
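The `acc_stderr` values reported above are consistent with the sample standard error of a binomial proportion, sqrt(p·(1−p)/(n−1)). A minimal sketch checking this against the GSM8K row — the split size of 1,319 test problems and the count of 881 correct answers are assumptions inferred from the public GSM8K test set and the reported accuracy, not stated in this card:

```python
import math

def binomial_stderr(p: float, n: int) -> float:
    """Sample standard error of a binary (0/1) accuracy over n examples."""
    return math.sqrt(p * (1.0 - p) / (n - 1))

# GSM8K: 881 correct out of an assumed 1,319 test problems.
acc = 881 / 1319
print(acc)                         # ≈ 0.6679302501895376, matching "acc" above
print(binomial_stderr(acc, 1319))  # ≈ 0.012972, matching "acc_stderr" above
```

The same formula reproduces the other single-accuracy rows (e.g. winogrande with 1,267 examples), so the per-task standard errors can be sanity-checked without re-running the harness.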
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ch08931/GabrielC | ch08931 | "2024-03-10T02:32:33Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T02:32:09Z" | ---
license: openrail
---
|
open-llm-leaderboard-old/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO | open-llm-leaderboard-old | "2024-03-10T02:38:02Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T02:37:40Z" | ---
pretty_name: Evaluation run of Locutusque/ChatHercules-2.5-Mistral-7B-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/ChatHercules-2.5-Mistral-7B-DPO](https://huggingface.co/Locutusque/ChatHercules-2.5-Mistral-7B-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T02:35:25.349975](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO/blob/main/results_2024-03-10T02-35-25.349975.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.65434321085394,\n\
\ \"acc_stderr\": 0.031825284831705845,\n \"acc_norm\": 0.655257152354157,\n\
\ \"acc_norm_stderr\": 0.03247573899311495,\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.522996054505985,\n\
\ \"mc2_stderr\": 0.014861512019306897\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.621160409556314,\n \"acc_stderr\": 0.014175915490000324,\n\
\ \"acc_norm\": 0.6604095563139932,\n \"acc_norm_stderr\": 0.01383903976282017\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6532563234415455,\n\
\ \"acc_stderr\": 0.004749606196363343,\n \"acc_norm\": 0.8540131447918742,\n\
\ \"acc_norm_stderr\": 0.0035237141526513\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.62,\n\
\ \"acc_stderr\": 0.048783173121456316,\n \"acc_norm\": 0.62,\n \
\ \"acc_norm_stderr\": 0.048783173121456316\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724053,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724053\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.46,\n \"acc_stderr\": 0.05009082659620332,\n \
\ \"acc_norm\": 0.46,\n \"acc_norm_stderr\": 0.05009082659620332\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6994219653179191,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.6994219653179191,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909284,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909284\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5957446808510638,\n \"acc_stderr\": 0.03208115750788684,\n\
\ \"acc_norm\": 0.5957446808510638,\n \"acc_norm_stderr\": 0.03208115750788684\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8032258064516129,\n \"acc_stderr\": 0.022616409420742025,\n \"\
acc_norm\": 0.8032258064516129,\n \"acc_norm_stderr\": 0.022616409420742025\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.03031371053819889,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.03031371053819889\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3814814814814815,\n \"acc_stderr\": 0.029616718927497593,\n \
\ \"acc_norm\": 0.3814814814814815,\n \"acc_norm_stderr\": 0.029616718927497593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7100840336134454,\n \"acc_stderr\": 0.029472485833136098,\n\
\ \"acc_norm\": 0.7100840336134454,\n \"acc_norm_stderr\": 0.029472485833136098\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8513761467889909,\n \"acc_stderr\": 0.015251253773660834,\n \"\
acc_norm\": 0.8513761467889909,\n \"acc_norm_stderr\": 0.015251253773660834\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.02732547096671631,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.02732547096671631\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.02574490253229092,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.02574490253229092\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8846153846153846,\n\
\ \"acc_stderr\": 0.02093019318517933,\n \"acc_norm\": 0.8846153846153846,\n\
\ \"acc_norm_stderr\": 0.02093019318517933\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.78,\n \"acc_stderr\": 0.04163331998932263,\n \
\ \"acc_norm\": 0.78,\n \"acc_norm_stderr\": 0.04163331998932263\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8326947637292464,\n\
\ \"acc_stderr\": 0.013347327202920332,\n \"acc_norm\": 0.8326947637292464,\n\
\ \"acc_norm_stderr\": 0.013347327202920332\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7543352601156069,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.7543352601156069,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3206703910614525,\n\
\ \"acc_stderr\": 0.015609929559348406,\n \"acc_norm\": 0.3206703910614525,\n\
\ \"acc_norm_stderr\": 0.015609929559348406\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824782,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824782\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7009646302250804,\n\
\ \"acc_stderr\": 0.02600330111788514,\n \"acc_norm\": 0.7009646302250804,\n\
\ \"acc_norm_stderr\": 0.02600330111788514\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7623456790123457,\n \"acc_stderr\": 0.02368359183700856,\n\
\ \"acc_norm\": 0.7623456790123457,\n \"acc_norm_stderr\": 0.02368359183700856\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48370273794002605,\n\
\ \"acc_stderr\": 0.01276345073469982,\n \"acc_norm\": 0.48370273794002605,\n\
\ \"acc_norm_stderr\": 0.01276345073469982\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7205882352941176,\n \"acc_stderr\": 0.02725720260611494,\n\
\ \"acc_norm\": 0.7205882352941176,\n \"acc_norm_stderr\": 0.02725720260611494\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6879084967320261,\n \"acc_stderr\": 0.01874501120127766,\n \
\ \"acc_norm\": 0.6879084967320261,\n \"acc_norm_stderr\": 0.01874501120127766\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.03487350880197769,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.03487350880197769\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3537331701346389,\n\
\ \"mc1_stderr\": 0.016737814358846147,\n \"mc2\": 0.522996054505985,\n\
\ \"mc2_stderr\": 0.014861512019306897\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.819258089976322,\n \"acc_stderr\": 0.010814911009613983\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6755117513267627,\n \
\ \"acc_stderr\": 0.012896095359768111\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/ChatHercules-2.5-Mistral-7B-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|arc:challenge|25_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|gsm8k|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hellaswag|10_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-35-25.349975.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T02-35-25.349975.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- '**/details_harness|winogrande|5_2024-03-10T02-35-25.349975.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T02-35-25.349975.parquet'
- config_name: results
data_files:
- split: 2024_03_10T02_35_25.349975
path:
- results_2024-03-10T02-35-25.349975.parquet
- split: latest
path:
- results_2024-03-10T02-35-25.349975.parquet
---
# Dataset Card for Evaluation run of Locutusque/ChatHercules-2.5-Mistral-7B-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/ChatHercules-2.5-Mistral-7B-DPO](https://huggingface.co/Locutusque/ChatHercules-2.5-Mistral-7B-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T02:35:25.349975](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__ChatHercules-2.5-Mistral-7B-DPO/blob/main/results_2024-03-10T02-35-25.349975.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.65434321085394,
"acc_stderr": 0.031825284831705845,
"acc_norm": 0.655257152354157,
"acc_norm_stderr": 0.03247573899311495,
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.522996054505985,
"mc2_stderr": 0.014861512019306897
},
"harness|arc:challenge|25": {
"acc": 0.621160409556314,
"acc_stderr": 0.014175915490000324,
"acc_norm": 0.6604095563139932,
"acc_norm_stderr": 0.01383903976282017
},
"harness|hellaswag|10": {
"acc": 0.6532563234415455,
"acc_stderr": 0.004749606196363343,
"acc_norm": 0.8540131447918742,
"acc_norm_stderr": 0.0035237141526513
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.62,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.62,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724053,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724053
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620332,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620332
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909284,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909284
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8032258064516129,
"acc_stderr": 0.022616409420742025,
"acc_norm": 0.8032258064516129,
"acc_norm_stderr": 0.022616409420742025
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.03031371053819889,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.03031371053819889
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3814814814814815,
"acc_stderr": 0.029616718927497593,
"acc_norm": 0.3814814814814815,
"acc_norm_stderr": 0.029616718927497593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7100840336134454,
"acc_stderr": 0.029472485833136098,
"acc_norm": 0.7100840336134454,
"acc_norm_stderr": 0.029472485833136098
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8513761467889909,
"acc_stderr": 0.015251253773660834,
"acc_norm": 0.8513761467889909,
"acc_norm_stderr": 0.015251253773660834
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.02732547096671631,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.02732547096671631
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.02574490253229092,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.02574490253229092
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8846153846153846,
"acc_stderr": 0.02093019318517933,
"acc_norm": 0.8846153846153846,
"acc_norm_stderr": 0.02093019318517933
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932263,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932263
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8326947637292464,
"acc_stderr": 0.013347327202920332,
"acc_norm": 0.8326947637292464,
"acc_norm_stderr": 0.013347327202920332
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7543352601156069,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.7543352601156069,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3206703910614525,
"acc_stderr": 0.015609929559348406,
"acc_norm": 0.3206703910614525,
"acc_norm_stderr": 0.015609929559348406
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824782,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824782
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7009646302250804,
"acc_stderr": 0.02600330111788514,
"acc_norm": 0.7009646302250804,
"acc_norm_stderr": 0.02600330111788514
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7623456790123457,
"acc_stderr": 0.02368359183700856,
"acc_norm": 0.7623456790123457,
"acc_norm_stderr": 0.02368359183700856
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48370273794002605,
"acc_stderr": 0.01276345073469982,
"acc_norm": 0.48370273794002605,
"acc_norm_stderr": 0.01276345073469982
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7205882352941176,
"acc_stderr": 0.02725720260611494,
"acc_norm": 0.7205882352941176,
"acc_norm_stderr": 0.02725720260611494
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6879084967320261,
"acc_stderr": 0.01874501120127766,
"acc_norm": 0.6879084967320261,
"acc_norm_stderr": 0.01874501120127766
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.03487350880197769,
"acc_norm": 0.86,
"acc_norm_stderr": 0.03487350880197769
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3537331701346389,
"mc1_stderr": 0.016737814358846147,
"mc2": 0.522996054505985,
"mc2_stderr": 0.014861512019306897
},
"harness|winogrande|5": {
"acc": 0.819258089976322,
"acc_stderr": 0.010814911009613983
},
"harness|gsm8k|5": {
"acc": 0.6755117513267627,
"acc_stderr": 0.012896095359768111
}
}
```
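As a minimal sketch (not part of the evaluation harness itself), the per-task accuracies in the results JSON above can be aggregated locally once downloaded; the values below are copied verbatim from the run above, and the unweighted mean is an illustrative choice of aggregation:

```python
import statistics

# A few per-task accuracies copied from the results JSON above
# (harness|hendrycksTest-* subtasks, 5-shot). Selection is illustrative.
subtask_acc = {
    "moral_disputes": 0.7543352601156069,
    "nutrition": 0.7549019607843137,
    "philosophy": 0.7009646302250804,
    "virology": 0.5481927710843374,
    "world_religions": 0.8245614035087719,
}

# Unweighted mean over the selected subtasks (the leaderboard may weight
# or aggregate differently; this is only a local sanity check).
mean_acc = statistics.mean(subtask_acc.values())
print(f"mean accuracy over {len(subtask_acc)} subtasks: {mean_acc:.4f}")
```

The same pattern extends to all 57 MMLU subtasks once the full results file is loaded.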
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Milanesero/homero_simpson | Milanesero | "2024-03-10T02:49:09Z" | 0 | 0 | [
"license:bigscience-openrail-m",
"region:us"
] | null | "2024-03-10T02:47:37Z" | ---
license: bigscience-openrail-m
---
|
RomanHauksson/splat | RomanHauksson | "2024-03-10T03:33:42Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-03-10T03:32:44Z" | ---
license: mit
---
|
open-llm-leaderboard-old/details_TeeZee__GALAXY-XB-v.02 | open-llm-leaderboard-old | "2024-03-10T03:38:37Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T03:38:17Z" | ---
pretty_name: Evaluation run of TeeZee/GALAXY-XB-v.02
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/GALAXY-XB-v.02](https://huggingface.co/TeeZee/GALAXY-XB-v.02) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.02\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T03:35:59.115457](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.02/blob/main/results_2024-03-10T03-35-59.115457.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6473317192829287,\n\
\ \"acc_stderr\": 0.031970700309535464,\n \"acc_norm\": 0.6522779185123677,\n\
\ \"acc_norm_stderr\": 0.032613648259990705,\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.4359908599812855,\n\
\ \"mc2_stderr\": 0.014376822823271565\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.575938566552901,\n \"acc_stderr\": 0.014441889627464396,\n\
\ \"acc_norm\": 0.606655290102389,\n \"acc_norm_stderr\": 0.014275101465693028\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6428002389962159,\n\
\ \"acc_stderr\": 0.004781950883460501,\n \"acc_norm\": 0.8327026488747261,\n\
\ \"acc_norm_stderr\": 0.0037247833892533324\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7368421052631579,\n \"acc_stderr\": 0.03583496176361073,\n\
\ \"acc_norm\": 0.7368421052631579,\n \"acc_norm_stderr\": 0.03583496176361073\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.67,\n\
\ \"acc_stderr\": 0.047258156262526094,\n \"acc_norm\": 0.67,\n \
\ \"acc_norm_stderr\": 0.047258156262526094\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.02872750295788027,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.02872750295788027\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080339\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.05021167315686779,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.05021167315686779\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.04897104952726366,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.04897104952726366\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.593103448275862,\n \"acc_stderr\": 0.04093793981266237,\n\
\ \"acc_norm\": 0.593103448275862,\n \"acc_norm_stderr\": 0.04093793981266237\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43915343915343913,\n \"acc_stderr\": 0.025559920550531003,\n \"\
acc_norm\": 0.43915343915343913,\n \"acc_norm_stderr\": 0.025559920550531003\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.42063492063492064,\n\
\ \"acc_stderr\": 0.04415438226743744,\n \"acc_norm\": 0.42063492063492064,\n\
\ \"acc_norm_stderr\": 0.04415438226743744\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8129032258064516,\n\
\ \"acc_stderr\": 0.02218571009225225,\n \"acc_norm\": 0.8129032258064516,\n\
\ \"acc_norm_stderr\": 0.02218571009225225\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n\
\ \"acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.031922715695482995,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.031922715695482995\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8232323232323232,\n \"acc_stderr\": 0.027178752639044915,\n \"\
acc_norm\": 0.8232323232323232,\n \"acc_norm_stderr\": 0.027178752639044915\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033467,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033467\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402538,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402538\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948482,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948482\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.03017680828897434,\n \
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.03017680828897434\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8385321100917431,\n \"acc_stderr\": 0.015776239256163248,\n \"\
acc_norm\": 0.8385321100917431,\n \"acc_norm_stderr\": 0.015776239256163248\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8284313725490197,\n \"acc_stderr\": 0.026460569561240644,\n\
\ \"acc_norm\": 0.8284313725490197,\n \"acc_norm_stderr\": 0.026460569561240644\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023348,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023348\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8349514563106796,\n \"acc_stderr\": 0.036756688322331886,\n\
\ \"acc_norm\": 0.8349514563106796,\n \"acc_norm_stderr\": 0.036756688322331886\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8589743589743589,\n\
\ \"acc_stderr\": 0.022801382534597528,\n \"acc_norm\": 0.8589743589743589,\n\
\ \"acc_norm_stderr\": 0.022801382534597528\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973134,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973134\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7196531791907514,\n \"acc_stderr\": 0.024182427496577605,\n\
\ \"acc_norm\": 0.7196531791907514,\n \"acc_norm_stderr\": 0.024182427496577605\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.27039106145251396,\n\
\ \"acc_stderr\": 0.014854993938010081,\n \"acc_norm\": 0.27039106145251396,\n\
\ \"acc_norm_stderr\": 0.014854993938010081\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7647058823529411,\n \"acc_stderr\": 0.024288619466046102,\n\
\ \"acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.024288619466046102\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140453,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140453\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7253086419753086,\n \"acc_stderr\": 0.02483605786829468,\n\
\ \"acc_norm\": 0.7253086419753086,\n \"acc_norm_stderr\": 0.02483605786829468\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4869621903520209,\n\
\ \"acc_stderr\": 0.012765893883835332,\n \"acc_norm\": 0.4869621903520209,\n\
\ \"acc_norm_stderr\": 0.012765893883835332\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6862745098039216,\n \"acc_stderr\": 0.018771683893528176,\n \
\ \"acc_norm\": 0.6862745098039216,\n \"acc_norm_stderr\": 0.018771683893528176\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27906976744186046,\n\
\ \"mc1_stderr\": 0.015702107090627904,\n \"mc2\": 0.4359908599812855,\n\
\ \"mc2_stderr\": 0.014376822823271565\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8026835043409629,\n \"acc_stderr\": 0.011185026389050366\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.42077331311599697,\n \
\ \"acc_stderr\": 0.013598489497182837\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/GALAXY-XB-v.02
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|arc:challenge|25_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|gsm8k|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hellaswag|10_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T03-35-59.115457.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T03-35-59.115457.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- '**/details_harness|winogrande|5_2024-03-10T03-35-59.115457.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T03-35-59.115457.parquet'
- config_name: results
data_files:
- split: 2024_03_10T03_35_59.115457
path:
- results_2024-03-10T03-35-59.115457.parquet
- split: latest
path:
- results_2024-03-10T03-35-59.115457.parquet
---
# Dataset Card for Evaluation run of TeeZee/GALAXY-XB-v.02
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/GALAXY-XB-v.02](https://huggingface.co/TeeZee/GALAXY-XB-v.02) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.02",
"harness_winogrande_5",
split="train")
```
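Since each run's split is named after its timestamp, the most recent run can also be selected by sorting the split names, which sort lexicographically in chronological order. A minimal sketch with hypothetical split names (only the first is taken from this card):

```python
# Pick the most recent timestamped split; the underscore-separated ISO-like
# naming means lexicographic order matches chronological order.
splits = ["2024_03_10T03_35_59.115457", "2024_02_01T12_00_00.000000", "latest"]

# Exclude the "latest" alias, then take the lexicographic maximum.
timestamped = [s for s in splits if s != "latest"]
most_recent = max(timestamped)
print(most_recent)  # → 2024_03_10T03_35_59.115457
```

In practice the `latest` split already aliases this run, so sorting is only needed when comparing several historical runs.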
## Latest results
These are the [latest results from run 2024-03-10T03:35:59.115457](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.02/blob/main/results_2024-03-10T03-35-59.115457.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6473317192829287,
"acc_stderr": 0.031970700309535464,
"acc_norm": 0.6522779185123677,
"acc_norm_stderr": 0.032613648259990705,
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.4359908599812855,
"mc2_stderr": 0.014376822823271565
},
"harness|arc:challenge|25": {
"acc": 0.575938566552901,
"acc_stderr": 0.014441889627464396,
"acc_norm": 0.606655290102389,
"acc_norm_stderr": 0.014275101465693028
},
"harness|hellaswag|10": {
"acc": 0.6428002389962159,
"acc_stderr": 0.004781950883460501,
"acc_norm": 0.8327026488747261,
"acc_norm_stderr": 0.0037247833892533324
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7368421052631579,
"acc_stderr": 0.03583496176361073,
"acc_norm": 0.7368421052631579,
"acc_norm_stderr": 0.03583496176361073
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.67,
"acc_stderr": 0.047258156262526094,
"acc_norm": 0.67,
"acc_norm_stderr": 0.047258156262526094
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.02872750295788027,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.02872750295788027
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686779,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686779
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.37,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.37,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.04897104952726366,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.04897104952726366
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.593103448275862,
"acc_stderr": 0.04093793981266237,
"acc_norm": 0.593103448275862,
"acc_norm_stderr": 0.04093793981266237
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43915343915343913,
"acc_stderr": 0.025559920550531003,
"acc_norm": 0.43915343915343913,
"acc_norm_stderr": 0.025559920550531003
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.04415438226743744,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.04415438226743744
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8129032258064516,
"acc_stderr": 0.02218571009225225,
"acc_norm": 0.8129032258064516,
"acc_norm_stderr": 0.02218571009225225
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.031922715695482995,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.031922715695482995
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8232323232323232,
"acc_stderr": 0.027178752639044915,
"acc_norm": 0.8232323232323232,
"acc_norm_stderr": 0.027178752639044915
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033467,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033467
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402538,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402538
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948482,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948482
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.03017680828897434,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.03017680828897434
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8385321100917431,
"acc_stderr": 0.015776239256163248,
"acc_norm": 0.8385321100917431,
"acc_norm_stderr": 0.015776239256163248
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8284313725490197,
"acc_stderr": 0.026460569561240644,
"acc_norm": 0.8284313725490197,
"acc_norm_stderr": 0.026460569561240644
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023348,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023348
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.8349514563106796,
"acc_stderr": 0.036756688322331886,
"acc_norm": 0.8349514563106796,
"acc_norm_stderr": 0.036756688322331886
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8589743589743589,
"acc_stderr": 0.022801382534597528,
"acc_norm": 0.8589743589743589,
"acc_norm_stderr": 0.022801382534597528
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973134,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973134
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7196531791907514,
"acc_stderr": 0.024182427496577605,
"acc_norm": 0.7196531791907514,
"acc_norm_stderr": 0.024182427496577605
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.27039106145251396,
"acc_stderr": 0.014854993938010081,
"acc_norm": 0.27039106145251396,
"acc_norm_stderr": 0.014854993938010081
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.024288619466046102,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.024288619466046102
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140453,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140453
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7253086419753086,
"acc_stderr": 0.02483605786829468,
"acc_norm": 0.7253086419753086,
"acc_norm_stderr": 0.02483605786829468
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4869621903520209,
"acc_stderr": 0.012765893883835332,
"acc_norm": 0.4869621903520209,
"acc_norm_stderr": 0.012765893883835332
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6862745098039216,
"acc_stderr": 0.018771683893528176,
"acc_norm": 0.6862745098039216,
"acc_norm_stderr": 0.018771683893528176
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27906976744186046,
"mc1_stderr": 0.015702107090627904,
"mc2": 0.4359908599812855,
"mc2_stderr": 0.014376822823271565
},
"harness|winogrande|5": {
"acc": 0.8026835043409629,
"acc_stderr": 0.011185026389050366
},
"harness|gsm8k|5": {
"acc": 0.42077331311599697,
"acc_stderr": 0.013598489497182837
}
}
```
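The per-task scores in the JSON above can be aggregated directly in plain Python. A minimal sketch that averages the MMLU (`hendrycksTest`) accuracies, using a small subset of the values shown (the full results dict has 57 such tasks):

```python
# Average the MMLU (hendrycksTest) accuracies from a results dict shaped
# like the JSON above; only three tasks are included here for brevity.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.4},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7368421052631579},
    "harness|winogrande|5": {"acc": 0.8026835043409629},  # not an MMLU task
}

# Keep only the hendrycksTest entries, then compute the mean accuracy.
mmlu_scores = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} tasks: {mmlu_avg:.4f}")
```

The leaderboard's reported MMLU number is the analogous average over all 57 `hendrycksTest` subtasks.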
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
gokulraj/colloquial_textbook | gokulraj | "2024-03-10T12:52:16Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T03:41:55Z" | ---
dataset_info:
features:
- name: english
dtype: string
- name: tamil
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 117358.3002610966
num_examples: 727
- name: test
num_bytes: 6295.699738903394
num_examples: 39
download_size: 56687
dataset_size: 123654.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
prompty/Furia | prompty | "2024-03-10T03:46:45Z" | 0 | 0 | [
"license:gfdl",
"size_categories:n<1K",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T03:42:27Z" | ---
license: gfdl
---
|
SauravMaheshkar/FewGLUE | SauravMaheshkar | "2024-03-18T18:23:42Z" | 0 | 1 | [
"language:en",
"size_categories:10K<n<100K",
"format:json",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"arxiv:2009.07118",
"region:us"
] | null | "2024-03-10T04:07:36Z" | ---
language:
- en
configs:
- config_name: BoolQ_train
data_files: BoolQ/train.jsonl
- config_name: BoolQ_unlabeled
data_files: BoolQ/unlabeled.jsonl
- config_name: CB_train
data_files: CB/train.jsonl
- config_name: CB_unlabeled
data_files: CB/unlabeled.jsonl
- config_name: COPA_train
data_files: COPA/train.jsonl
- config_name: COPA_unlabeled
data_files: COPA/unlabeled.jsonl
- config_name: MultiRC_train
data_files: MultiRC/train.jsonl
- config_name: MultiRC_unlabeled
data_files: MultiRC/unlabeled.jsonl
- config_name: RTE_train
data_files: RTE/train.jsonl
- config_name: RTE_unlabeled
data_files: RTE/unlabeled.jsonl
- config_name: ReCoRD_train
data_files: ReCoRD/train.jsonl
- config_name: ReCoRD_unlabeled
data_files: ReCoRD/unlabeled.jsonl
- config_name: WSC_train
data_files: WSC/train.jsonl
- config_name: WSC_unlabeled
data_files: WSC/unlabeled.jsonl
- config_name: WiC_train
data_files: WiC/train.jsonl
- config_name: WiC_unlabeled
data_files: WiC/unlabeled.jsonl
---
# [FewGLUE](https://arxiv.org/abs/2009.07118)
The FewGLUE dataset consists of a random selection of 32 training examples from the SuperGLUE training sets and up to 20,000 unlabeled examples for each SuperGLUE task.
[Adapted from the original repository](https://github.com/timoschick/fewglue)
## 📕 Citation
```bibtex
@article{schick2020small,
title={It's Not Just Size That Matters: Small Language Models Are Also Few-Shot Learners},
author={Timo Schick and Hinrich Schütze},
journal={Computing Research Repository},
volume={arXiv:2009.07118},
url={http://arxiv.org/abs/2009.07118},
year={2020}
}
```
|
open-llm-leaderboard-old/details_liminerity__M7-7b | open-llm-leaderboard-old | "2024-03-10T04:15:06Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T04:14:45Z" | ---
pretty_name: Evaluation run of liminerity/M7-7b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [liminerity/M7-7b](https://huggingface.co/liminerity/M7-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_liminerity__M7-7b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T04:12:29.468873](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__M7-7b/blob/main/results_2024-03-10T04-12-29.468873.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6518510677976063,\n\
\ \"acc_stderr\": 0.03203683002461246,\n \"acc_norm\": 0.6505703294705394,\n\
\ \"acc_norm_stderr\": 0.03271494006314174,\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7792924999444771,\n\
\ \"mc2_stderr\": 0.013713064522592473\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7167235494880546,\n \"acc_stderr\": 0.013167478735134575,\n\
\ \"acc_norm\": 0.7286689419795221,\n \"acc_norm_stderr\": 0.012993807727545796\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7177853017327226,\n\
\ \"acc_stderr\": 0.0044915745394418834,\n \"acc_norm\": 0.8914558852818164,\n\
\ \"acc_norm_stderr\": 0.003104306434972473\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6370370370370371,\n\
\ \"acc_stderr\": 0.04153948404742398,\n \"acc_norm\": 0.6370370370370371,\n\
\ \"acc_norm_stderr\": 0.04153948404742398\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7018867924528301,\n \"acc_stderr\": 0.028152837942493864,\n\
\ \"acc_norm\": 0.7018867924528301,\n \"acc_norm_stderr\": 0.028152837942493864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956911,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956911\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.57,\n \"acc_stderr\": 0.04975698519562428,\n \"acc_norm\": 0.57,\n\
\ \"acc_norm_stderr\": 0.04975698519562428\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909283,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909283\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n\
\ \"acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055273,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055273\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.02341529343356853,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.02341529343356853\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.035158955511656986,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.035158955511656986\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.03287666758603491,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.03287666758603491\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.02860620428922987,\n \"acc_norm\"\
: 0.797979797979798,\n \"acc_norm_stderr\": 0.02860620428922987\n },\n\
\ \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n \
\ \"acc\": 0.9119170984455959,\n \"acc_stderr\": 0.02045374660160103,\n\
\ \"acc_norm\": 0.9119170984455959,\n \"acc_norm_stderr\": 0.02045374660160103\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6615384615384615,\n \"acc_stderr\": 0.023991500500313036,\n\
\ \"acc_norm\": 0.6615384615384615,\n \"acc_norm_stderr\": 0.023991500500313036\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6722689075630253,\n \"acc_stderr\": 0.03048991141767323,\n \
\ \"acc_norm\": 0.6722689075630253,\n \"acc_norm_stderr\": 0.03048991141767323\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374303,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374303\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.03407632093854051,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.03407632093854051\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250447,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250447\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.025310495376944856,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.025310495376944856\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.75,\n \
\ \"acc_stderr\": 0.04186091791394607,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.04186091791394607\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7791411042944786,\n \"acc_stderr\": 0.03259177392742178,\n\
\ \"acc_norm\": 0.7791411042944786,\n \"acc_norm_stderr\": 0.03259177392742178\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406964,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406964\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.822477650063857,\n\
\ \"acc_stderr\": 0.013664230995834841,\n \"acc_norm\": 0.822477650063857,\n\
\ \"acc_norm_stderr\": 0.013664230995834841\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.02394851290546836,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.02394851290546836\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42681564245810055,\n\
\ \"acc_stderr\": 0.016542401954631917,\n \"acc_norm\": 0.42681564245810055,\n\
\ \"acc_norm_stderr\": 0.016542401954631917\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826524,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826524\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818763,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818763\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4858156028368794,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.4858156028368794,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4765319426336376,\n\
\ \"acc_stderr\": 0.012756161942523369,\n \"acc_norm\": 0.4765319426336376,\n\
\ \"acc_norm_stderr\": 0.012756161942523369\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6801470588235294,\n \"acc_stderr\": 0.02833295951403121,\n\
\ \"acc_norm\": 0.6801470588235294,\n \"acc_norm_stderr\": 0.02833295951403121\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784596,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784596\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454125,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454125\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.631578947368421,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.7792924999444771,\n\
\ \"mc2_stderr\": 0.013713064522592473\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8476716653512234,\n \"acc_stderr\": 0.010099208246065604\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.7172100075815011,\n \
\ \"acc_stderr\": 0.012405020417873619\n }\n}\n```"
repo_url: https://huggingface.co/liminerity/M7-7b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|arc:challenge|25_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|gsm8k|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hellaswag|10_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-12-29.468873.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T04-12-29.468873.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- '**/details_harness|winogrande|5_2024-03-10T04-12-29.468873.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T04-12-29.468873.parquet'
- config_name: results
data_files:
- split: 2024_03_10T04_12_29.468873
path:
- results_2024-03-10T04-12-29.468873.parquet
- split: latest
path:
- results_2024-03-10T04-12-29.468873.parquet
---
# Dataset Card for Evaluation run of liminerity/M7-7b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [liminerity/M7-7b](https://huggingface.co/liminerity/M7-7b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_liminerity__M7-7b",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T04:12:29.468873](https://huggingface.co/datasets/open-llm-leaderboard/details_liminerity__M7-7b/blob/main/results_2024-03-10T04-12-29.468873.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6518510677976063,
"acc_stderr": 0.03203683002461246,
"acc_norm": 0.6505703294705394,
"acc_norm_stderr": 0.03271494006314174,
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7792924999444771,
"mc2_stderr": 0.013713064522592473
},
"harness|arc:challenge|25": {
"acc": 0.7167235494880546,
"acc_stderr": 0.013167478735134575,
"acc_norm": 0.7286689419795221,
"acc_norm_stderr": 0.012993807727545796
},
"harness|hellaswag|10": {
"acc": 0.7177853017327226,
"acc_stderr": 0.0044915745394418834,
"acc_norm": 0.8914558852818164,
"acc_norm_stderr": 0.003104306434972473
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6370370370370371,
"acc_stderr": 0.04153948404742398,
"acc_norm": 0.6370370370370371,
"acc_norm_stderr": 0.04153948404742398
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7018867924528301,
"acc_stderr": 0.028152837942493864,
"acc_norm": 0.7018867924528301,
"acc_norm_stderr": 0.028152837942493864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.57,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.57,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055273,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055273
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.02341529343356853,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.02341529343356853
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.035158955511656986,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.035158955511656986
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.03287666758603491,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.03287666758603491
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.02860620428922987,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.02860620428922987
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9119170984455959,
"acc_stderr": 0.02045374660160103,
"acc_norm": 0.9119170984455959,
"acc_norm_stderr": 0.02045374660160103
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6615384615384615,
"acc_stderr": 0.023991500500313036,
"acc_norm": 0.6615384615384615,
"acc_norm_stderr": 0.023991500500313036
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6722689075630253,
"acc_stderr": 0.03048991141767323,
"acc_norm": 0.6722689075630253,
"acc_norm_stderr": 0.03048991141767323
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374303,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374303
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.03407632093854051,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.03407632093854051
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250447,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250447
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.025310495376944856,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.025310495376944856
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.75,
"acc_stderr": 0.04186091791394607,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04186091791394607
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7791411042944786,
"acc_stderr": 0.03259177392742178,
"acc_norm": 0.7791411042944786,
"acc_norm_stderr": 0.03259177392742178
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406964,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406964
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.822477650063857,
"acc_stderr": 0.013664230995834841,
"acc_norm": 0.822477650063857,
"acc_norm_stderr": 0.013664230995834841
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.02394851290546836,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.02394851290546836
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42681564245810055,
"acc_stderr": 0.016542401954631917,
"acc_norm": 0.42681564245810055,
"acc_norm_stderr": 0.016542401954631917
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826524,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826524
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818763,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818763
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4858156028368794,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.4858156028368794,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4765319426336376,
"acc_stderr": 0.012756161942523369,
"acc_norm": 0.4765319426336376,
"acc_norm_stderr": 0.012756161942523369
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6801470588235294,
"acc_stderr": 0.02833295951403121,
"acc_norm": 0.6801470588235294,
"acc_norm_stderr": 0.02833295951403121
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784596,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784596
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454125,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454125
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.631578947368421,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.7792924999444771,
"mc2_stderr": 0.013713064522592473
},
"harness|winogrande|5": {
"acc": 0.8476716653512234,
"acc_stderr": 0.010099208246065604
},
"harness|gsm8k|5": {
"acc": 0.7172100075815011,
"acc_stderr": 0.012405020417873619
}
}
```
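As a sketch of how the per-task scores above can be aggregated, the snippet below averages `acc_norm` across the MMLU (`hendrycksTest`) subtasks from a results dict shaped like the JSON above. The dict is abbreviated to three subtasks here for illustration; in practice you would load the full `results_*.json` file or the "results" configuration.

```python
# Average acc_norm over MMLU (hendrycksTest) subtasks from a results dict
# shaped like the JSON above (abbreviated to three subtasks for this sketch).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc_norm": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc_norm": 0.6370370370370371},
    "harness|hendrycksTest-astronomy|5": {"acc_norm": 0.7105263157894737},
}

# Select only the MMLU subtasks by their "harness|hendrycksTest-..." key prefix.
mmlu_scores = [
    v["acc_norm"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU average over {len(mmlu_scores)} subtasks: {mmlu_avg:.4f}")
```

Run over all 57 `hendrycksTest` entries, this average is what the leaderboard reports as the MMLU score for the run.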
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
anjan77/ecommerce-faq-lllama2-dataset | anjan77 | "2024-03-10T04:58:44Z" | 0 | 0 | [
"task_categories:text-classification",
"license:apache-2.0",
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"Question Answering"
] | [
"text-classification"
] | "2024-03-10T04:25:54Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 38858
num_examples: 158
download_size: 9384
dataset_size: 38858
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- text-classification
tags:
- Question Answering
--- |
TawyeebOS/llama-2-7b-roleplay-script | TawyeebOS | "2024-03-10T05:08:36Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T04:29:19Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 277533
num_examples: 570
download_size: 166147
dataset_size: 277533
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Bibek1129/nepali_SQuAD | Bibek1129 | "2024-03-10T04:53:55Z" | 0 | 1 | [
"license:cc-by-4.0",
"size_categories:10K<n<100K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T04:51:57Z" | ---
license: cc-by-4.0
---
|
Bibek1129/nepali_SQuAD_multiple_qsns | Bibek1129 | "2024-03-10T04:55:58Z" | 0 | 2 | [
"license:cc-by-4.0",
"size_categories:10K<n<100K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T04:55:21Z" | ---
license: cc-by-4.0
---
|
Bibek1129/nepali_SQuAD_single_qsn | Bibek1129 | "2024-03-10T04:57:13Z" | 0 | 1 | [
"license:cc-by-4.0",
"size_categories:10K<n<100K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T04:56:36Z" | ---
license: cc-by-4.0
---
|
open-llm-leaderboard-old/details_Locutusque__Hyperion-2.0-Mistral-7B | open-llm-leaderboard-old | "2024-03-10T05:54:50Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T04:57:31Z" | ---
pretty_name: Evaluation run of Locutusque/Hyperion-2.0-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Locutusque/Hyperion-2.0-Mistral-7B](https://huggingface.co/Locutusque/Hyperion-2.0-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Locutusque__Hyperion-2.0-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T05:52:30.143262](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hyperion-2.0-Mistral-7B/blob/main/results_2024-03-10T05-52-30.143262.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6346692637770753,\n\
\ \"acc_stderr\": 0.03232834743290968,\n \"acc_norm\": 0.6397577306836747,\n\
\ \"acc_norm_stderr\": 0.03297845242054893,\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4197149652162468,\n\
\ \"mc2_stderr\": 0.014030449483056798\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5725255972696246,\n \"acc_stderr\": 0.014456862944650649,\n\
\ \"acc_norm\": 0.6109215017064846,\n \"acc_norm_stderr\": 0.014247309976045607\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6323441545508863,\n\
\ \"acc_stderr\": 0.004811815959388832,\n \"acc_norm\": 0.8349930292770364,\n\
\ \"acc_norm_stderr\": 0.0037042823907817183\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.04094376269996793,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.04094376269996793\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.04943110704237102,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.04943110704237102\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.028450154794118637,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.028450154794118637\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7291666666666666,\n\
\ \"acc_stderr\": 0.03716177437566017,\n \"acc_norm\": 0.7291666666666666,\n\
\ \"acc_norm_stderr\": 0.03716177437566017\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3431372549019608,\n \"acc_stderr\": 0.04724007352383887,\n\
\ \"acc_norm\": 0.3431372549019608,\n \"acc_norm_stderr\": 0.04724007352383887\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.8,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\": 0.8,\n\
\ \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.047028804320496165,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.047028804320496165\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.04122737111370332,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.04122737111370332\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3835978835978836,\n \"acc_stderr\": 0.025043757318520196,\n \"\
acc_norm\": 0.3835978835978836,\n \"acc_norm_stderr\": 0.025043757318520196\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.38095238095238093,\n\
\ \"acc_stderr\": 0.043435254289490965,\n \"acc_norm\": 0.38095238095238093,\n\
\ \"acc_norm_stderr\": 0.043435254289490965\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7645161290322581,\n\
\ \"acc_stderr\": 0.024137632429337714,\n \"acc_norm\": 0.7645161290322581,\n\
\ \"acc_norm_stderr\": 0.024137632429337714\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.46798029556650245,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.46798029556650245,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.033744026441394036,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.033744026441394036\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.028869778460267045,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.028869778460267045\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3592592592592593,\n \"acc_stderr\": 0.029252905927251976,\n \
\ \"acc_norm\": 0.3592592592592593,\n \"acc_norm_stderr\": 0.029252905927251976\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6512605042016807,\n \"acc_stderr\": 0.030956636328566548,\n\
\ \"acc_norm\": 0.6512605042016807,\n \"acc_norm_stderr\": 0.030956636328566548\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8275229357798165,\n \"acc_stderr\": 0.01619780795684805,\n \"\
acc_norm\": 0.8275229357798165,\n \"acc_norm_stderr\": 0.01619780795684805\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5092592592592593,\n \"acc_stderr\": 0.034093869469927006,\n \"\
acc_norm\": 0.5092592592592593,\n \"acc_norm_stderr\": 0.034093869469927006\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8137254901960784,\n \"acc_stderr\": 0.027325470966716312,\n \"\
acc_norm\": 0.8137254901960784,\n \"acc_norm_stderr\": 0.027325470966716312\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.02747974455080851,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.02747974455080851\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.03076935200822915,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.03076935200822915\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8015267175572519,\n \"acc_stderr\": 0.034981493854624714,\n\
\ \"acc_norm\": 0.8015267175572519,\n \"acc_norm_stderr\": 0.034981493854624714\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7768595041322314,\n \"acc_stderr\": 0.03800754475228732,\n \"\
acc_norm\": 0.7768595041322314,\n \"acc_norm_stderr\": 0.03800754475228732\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7592592592592593,\n\
\ \"acc_stderr\": 0.04133119440243838,\n \"acc_norm\": 0.7592592592592593,\n\
\ \"acc_norm_stderr\": 0.04133119440243838\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.8098159509202454,\n \"acc_stderr\": 0.030833491146281245,\n\
\ \"acc_norm\": 0.8098159509202454,\n \"acc_norm_stderr\": 0.030833491146281245\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.72,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8122605363984674,\n\
\ \"acc_stderr\": 0.013964393769899136,\n \"acc_norm\": 0.8122605363984674,\n\
\ \"acc_norm_stderr\": 0.013964393769899136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.708092485549133,\n \"acc_stderr\": 0.02447699407624733,\n\
\ \"acc_norm\": 0.708092485549133,\n \"acc_norm_stderr\": 0.02447699407624733\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2994413407821229,\n\
\ \"acc_stderr\": 0.015318257745976708,\n \"acc_norm\": 0.2994413407821229,\n\
\ \"acc_norm_stderr\": 0.015318257745976708\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7549019607843137,\n \"acc_stderr\": 0.024630048979824775,\n\
\ \"acc_norm\": 0.7549019607843137,\n \"acc_norm_stderr\": 0.024630048979824775\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818774,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818774\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7191358024691358,\n \"acc_stderr\": 0.025006469755799208,\n\
\ \"acc_norm\": 0.7191358024691358,\n \"acc_norm_stderr\": 0.025006469755799208\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4576271186440678,\n\
\ \"acc_stderr\": 0.012724296550980188,\n \"acc_norm\": 0.4576271186440678,\n\
\ \"acc_norm_stderr\": 0.012724296550980188\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6503267973856209,\n \"acc_stderr\": 0.01929196189506638,\n \
\ \"acc_norm\": 0.6503267973856209,\n \"acc_norm_stderr\": 0.01929196189506638\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.0377525168068637,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.0377525168068637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727665,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727665\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.27539779681762544,\n\
\ \"mc1_stderr\": 0.01563813566777552,\n \"mc2\": 0.4197149652162468,\n\
\ \"mc2_stderr\": 0.014030449483056798\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7924230465666929,\n \"acc_stderr\": 0.011398593419386772\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4177407126611069,\n \
\ \"acc_stderr\": 0.013584820638504832\n }\n}\n```"
repo_url: https://huggingface.co/Locutusque/Hyperion-2.0-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|arc:challenge|25_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|arc:challenge|25_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|gsm8k|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|gsm8k|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hellaswag|10_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hellaswag|10_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-55-15.610547.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-52-30.143262.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T05-52-30.143262.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- '**/details_harness|winogrande|5_2024-03-10T04-55-15.610547.parquet'
- split: 2024_03_10T05_52_30.143262
path:
- '**/details_harness|winogrande|5_2024-03-10T05-52-30.143262.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T05-52-30.143262.parquet'
- config_name: results
data_files:
- split: 2024_03_10T04_55_15.610547
path:
- results_2024-03-10T04-55-15.610547.parquet
- split: 2024_03_10T05_52_30.143262
path:
- results_2024-03-10T05-52-30.143262.parquet
- split: latest
path:
- results_2024-03-10T05-52-30.143262.parquet
---
# Dataset Card for Evaluation run of Locutusque/Hyperion-2.0-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Locutusque/Hyperion-2.0-Mistral-7B](https://huggingface.co/Locutusque/Hyperion-2.0-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Locutusque__Hyperion-2.0-Mistral-7B",
"harness_winogrande_5",
split="train")
```
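Each run is also exposed as its own timestamped split. Judging from the split names in the config above, a run timestamp such as `2024-03-10T05:52:30.143262` maps to the split `2024_03_10T05_52_30.143262`, i.e. `-` and `:` are replaced by `_` while the fractional-second dot is kept. A small helper for deriving the split name from a run timestamp could look like this (the function name is illustrative, not part of any library):

```python
def run_timestamp_to_split(timestamp: str) -> str:
    """Map a run timestamp (e.g. "2024-03-10T05:52:30.143262")
    to its split name (e.g. "2024_03_10T05_52_30.143262").

    Note: this mirrors the naming convention observed in this card's
    config section; it is a convenience sketch, not an official API.
    """
    return timestamp.replace("-", "_").replace(":", "_")

# Example: load a specific run instead of the "latest" split, e.g.
# load_dataset(..., split=run_timestamp_to_split("2024-03-10T05:52:30.143262"))
```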
## Latest results
These are the [latest results from run 2024-03-10T05:52:30.143262](https://huggingface.co/datasets/open-llm-leaderboard/details_Locutusque__Hyperion-2.0-Mistral-7B/blob/main/results_2024-03-10T05-52-30.143262.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6346692637770753,
"acc_stderr": 0.03232834743290968,
"acc_norm": 0.6397577306836747,
"acc_norm_stderr": 0.03297845242054893,
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4197149652162468,
"mc2_stderr": 0.014030449483056798
},
"harness|arc:challenge|25": {
"acc": 0.5725255972696246,
"acc_stderr": 0.014456862944650649,
"acc_norm": 0.6109215017064846,
"acc_norm_stderr": 0.014247309976045607
},
"harness|hellaswag|10": {
"acc": 0.6323441545508863,
"acc_stderr": 0.004811815959388832,
"acc_norm": 0.8349930292770364,
"acc_norm_stderr": 0.0037042823907817183
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.04094376269996793,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.04094376269996793
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237102,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237102
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.028450154794118637,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.028450154794118637
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7291666666666666,
"acc_stderr": 0.03716177437566017,
"acc_norm": 0.7291666666666666,
"acc_norm_stderr": 0.03716177437566017
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3431372549019608,
"acc_stderr": 0.04724007352383887,
"acc_norm": 0.3431372549019608,
"acc_norm_stderr": 0.04724007352383887
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.8,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.8,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.047028804320496165,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.047028804320496165
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.04122737111370332,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.04122737111370332
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3835978835978836,
"acc_stderr": 0.025043757318520196,
"acc_norm": 0.3835978835978836,
"acc_norm_stderr": 0.025043757318520196
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.38095238095238093,
"acc_stderr": 0.043435254289490965,
"acc_norm": 0.38095238095238093,
"acc_norm_stderr": 0.043435254289490965
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.024137632429337714,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.024137632429337714
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.46798029556650245,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.46798029556650245,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.033744026441394036,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.033744026441394036
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.028869778460267045,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.028869778460267045
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3592592592592593,
"acc_stderr": 0.029252905927251976,
"acc_norm": 0.3592592592592593,
"acc_norm_stderr": 0.029252905927251976
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6512605042016807,
"acc_stderr": 0.030956636328566548,
"acc_norm": 0.6512605042016807,
"acc_norm_stderr": 0.030956636328566548
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8275229357798165,
"acc_stderr": 0.01619780795684805,
"acc_norm": 0.8275229357798165,
"acc_norm_stderr": 0.01619780795684805
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5092592592592593,
"acc_stderr": 0.034093869469927006,
"acc_norm": 0.5092592592592593,
"acc_norm_stderr": 0.034093869469927006
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8137254901960784,
"acc_stderr": 0.027325470966716312,
"acc_norm": 0.8137254901960784,
"acc_norm_stderr": 0.027325470966716312
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.02747974455080851,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.02747974455080851
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.03076935200822915,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.03076935200822915
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8015267175572519,
"acc_stderr": 0.034981493854624714,
"acc_norm": 0.8015267175572519,
"acc_norm_stderr": 0.034981493854624714
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7768595041322314,
"acc_stderr": 0.03800754475228732,
"acc_norm": 0.7768595041322314,
"acc_norm_stderr": 0.03800754475228732
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7592592592592593,
"acc_stderr": 0.04133119440243838,
"acc_norm": 0.7592592592592593,
"acc_norm_stderr": 0.04133119440243838
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.8098159509202454,
"acc_stderr": 0.030833491146281245,
"acc_norm": 0.8098159509202454,
"acc_norm_stderr": 0.030833491146281245
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8122605363984674,
"acc_stderr": 0.013964393769899136,
"acc_norm": 0.8122605363984674,
"acc_norm_stderr": 0.013964393769899136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.708092485549133,
"acc_stderr": 0.02447699407624733,
"acc_norm": 0.708092485549133,
"acc_norm_stderr": 0.02447699407624733
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2994413407821229,
"acc_stderr": 0.015318257745976708,
"acc_norm": 0.2994413407821229,
"acc_norm_stderr": 0.015318257745976708
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7549019607843137,
"acc_stderr": 0.024630048979824775,
"acc_norm": 0.7549019607843137,
"acc_norm_stderr": 0.024630048979824775
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818774,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818774
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7191358024691358,
"acc_stderr": 0.025006469755799208,
"acc_norm": 0.7191358024691358,
"acc_norm_stderr": 0.025006469755799208
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4576271186440678,
"acc_stderr": 0.012724296550980188,
"acc_norm": 0.4576271186440678,
"acc_norm_stderr": 0.012724296550980188
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6503267973856209,
"acc_stderr": 0.01929196189506638,
"acc_norm": 0.6503267973856209,
"acc_norm_stderr": 0.01929196189506638
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.0377525168068637,
"acc_norm": 0.83,
"acc_norm_stderr": 0.0377525168068637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727665,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727665
},
"harness|truthfulqa:mc|0": {
"mc1": 0.27539779681762544,
"mc1_stderr": 0.01563813566777552,
"mc2": 0.4197149652162468,
"mc2_stderr": 0.014030449483056798
},
"harness|winogrande|5": {
"acc": 0.7924230465666929,
"acc_stderr": 0.011398593419386772
},
"harness|gsm8k|5": {
"acc": 0.4177407126611069,
"acc_stderr": 0.013584820638504832
}
}
```
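Aggregates like the "all" block above can be recomputed directly from the per-task entries. As a rough sketch (using a small illustrative subset of the payload rather than the full JSON), the mean accuracy over the MMLU (`hendrycksTest`) subtasks can be derived by filtering on the task-key prefix:

```python
# Illustrative subset of a results dict shaped like the JSON above;
# values are copied from the per-task entries, not the full payload.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6592592592592592},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6710526315789473},
    "harness|arc:challenge|25": {"acc": 0.5725255972696246},
}

# Keep only MMLU subtasks, identified by the "hendrycksTest" prefix.
mmlu_scores = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(f"MMLU subtasks: {len(mmlu_scores)}, mean acc: {mmlu_avg:.4f}")
```

The same pattern applies to any task family by changing the prefix (e.g. `harness|arc:` or `harness|truthfulqa:`).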
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_TeeZee__DarkSapling-7B-v2.0 | open-llm-leaderboard-old | "2024-03-10T04:59:49Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T04:59:27Z" | ---
pretty_name: Evaluation run of TeeZee/DarkSapling-7B-v2.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/DarkSapling-7B-v2.0](https://huggingface.co/TeeZee/DarkSapling-7B-v2.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v2.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T04:57:12.333081](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v2.0/blob/main/results_2024-03-10T04-57-12.333081.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6424579193008534,\n\
\ \"acc_stderr\": 0.032218866498356466,\n \"acc_norm\": 0.6471361795899754,\n\
\ \"acc_norm_stderr\": 0.032858397843778114,\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5221487837375264,\n\
\ \"mc2_stderr\": 0.015253502717954797\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6023890784982935,\n \"acc_stderr\": 0.01430175222327954,\n\
\ \"acc_norm\": 0.6416382252559727,\n \"acc_norm_stderr\": 0.014012883334859857\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6589324835690101,\n\
\ \"acc_stderr\": 0.004730991357194292,\n \"acc_norm\": 0.8510256920932086,\n\
\ \"acc_norm_stderr\": 0.003553354528132355\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6710526315789473,\n \"acc_stderr\": 0.03823428969926604,\n\
\ \"acc_norm\": 0.6710526315789473,\n \"acc_norm_stderr\": 0.03823428969926604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.028049186315695248,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.028049186315695248\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7361111111111112,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.7361111111111112,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.53,\n \"acc_stderr\": 0.05016135580465919,\n \"acc_norm\": 0.53,\n\
\ \"acc_norm_stderr\": 0.05016135580465919\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.0368122963339432,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.0368122963339432\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.042295258468165065,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.042295258468165065\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5914893617021276,\n \"acc_stderr\": 0.032134180267015755,\n\
\ \"acc_norm\": 0.5914893617021276,\n \"acc_norm_stderr\": 0.032134180267015755\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.0470070803355104,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.0470070803355104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5862068965517241,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.5862068965517241,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3994708994708995,\n \"acc_stderr\": 0.025225450284067887,\n \"\
acc_norm\": 0.3994708994708995,\n \"acc_norm_stderr\": 0.025225450284067887\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4126984126984127,\n\
\ \"acc_stderr\": 0.04403438954768177,\n \"acc_norm\": 0.4126984126984127,\n\
\ \"acc_norm_stderr\": 0.04403438954768177\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7709677419354839,\n \"acc_stderr\": 0.023904914311782648,\n \"\
acc_norm\": 0.7709677419354839,\n \"acc_norm_stderr\": 0.023904914311782648\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.03158415324047711,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.03158415324047711\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463355,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463355\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.022935144053919443,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.022935144053919443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.362962962962963,\n \"acc_stderr\": 0.029318203645206865,\n \
\ \"acc_norm\": 0.362962962962963,\n \"acc_norm_stderr\": 0.029318203645206865\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6680672268907563,\n \"acc_stderr\": 0.03058869701378364,\n \
\ \"acc_norm\": 0.6680672268907563,\n \"acc_norm_stderr\": 0.03058869701378364\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8238532110091743,\n \"acc_stderr\": 0.016332882393431385,\n \"\
acc_norm\": 0.8238532110091743,\n \"acc_norm_stderr\": 0.016332882393431385\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437406,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437406\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7805907172995781,\n \"acc_stderr\": 0.026939106581553945,\n \
\ \"acc_norm\": 0.7805907172995781,\n \"acc_norm_stderr\": 0.026939106581553945\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6771300448430493,\n\
\ \"acc_stderr\": 0.03138147637575499,\n \"acc_norm\": 0.6771300448430493,\n\
\ \"acc_norm_stderr\": 0.03138147637575499\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7709923664122137,\n \"acc_stderr\": 0.036853466317118506,\n\
\ \"acc_norm\": 0.7709923664122137,\n \"acc_norm_stderr\": 0.036853466317118506\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8760683760683761,\n\
\ \"acc_stderr\": 0.02158649400128139,\n \"acc_norm\": 0.8760683760683761,\n\
\ \"acc_norm_stderr\": 0.02158649400128139\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816508,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816508\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8173690932311622,\n\
\ \"acc_stderr\": 0.013816335389973136,\n \"acc_norm\": 0.8173690932311622,\n\
\ \"acc_norm_stderr\": 0.013816335389973136\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.024027745155265012,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.024027745155265012\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3486033519553073,\n\
\ \"acc_stderr\": 0.015937484656687033,\n \"acc_norm\": 0.3486033519553073,\n\
\ \"acc_norm_stderr\": 0.015937484656687033\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7516339869281046,\n \"acc_stderr\": 0.02473998135511359,\n\
\ \"acc_norm\": 0.7516339869281046,\n \"acc_norm_stderr\": 0.02473998135511359\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7283950617283951,\n \"acc_stderr\": 0.024748624490537368,\n\
\ \"acc_norm\": 0.7283950617283951,\n \"acc_norm_stderr\": 0.024748624490537368\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.48226950354609927,\n \"acc_stderr\": 0.02980873964223777,\n \
\ \"acc_norm\": 0.48226950354609927,\n \"acc_norm_stderr\": 0.02980873964223777\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4491525423728814,\n\
\ \"acc_stderr\": 0.012704030518851488,\n \"acc_norm\": 0.4491525423728814,\n\
\ \"acc_norm_stderr\": 0.012704030518851488\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.028418208619406755,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.028418208619406755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468712,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468712\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7346938775510204,\n \"acc_stderr\": 0.028263889943784593,\n\
\ \"acc_norm\": 0.7346938775510204,\n \"acc_norm_stderr\": 0.028263889943784593\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.026508590656233268,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.026508590656233268\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5602409638554217,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.5602409638554217,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8245614035087719,\n \"acc_stderr\": 0.029170885500727668,\n\
\ \"acc_norm\": 0.8245614035087719,\n \"acc_norm_stderr\": 0.029170885500727668\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3659730722154223,\n\
\ \"mc1_stderr\": 0.016862941684088376,\n \"mc2\": 0.5221487837375264,\n\
\ \"mc2_stderr\": 0.015253502717954797\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7861089187056038,\n \"acc_stderr\": 0.011524466954090248\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4541319181197877,\n \
\ \"acc_stderr\": 0.01371441094526456\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/DarkSapling-7B-v2.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|arc:challenge|25_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|gsm8k|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hellaswag|10_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-57-12.333081.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T04-57-12.333081.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- '**/details_harness|winogrande|5_2024-03-10T04-57-12.333081.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T04-57-12.333081.parquet'
- config_name: results
data_files:
- split: 2024_03_10T04_57_12.333081
path:
- results_2024-03-10T04-57-12.333081.parquet
- split: latest
path:
- results_2024-03-10T04-57-12.333081.parquet
---
# Dataset Card for Evaluation run of TeeZee/DarkSapling-7B-v2.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/DarkSapling-7B-v2.0](https://huggingface.co/TeeZee/DarkSapling-7B-v2.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v2.0",
"harness_winogrande_5",
split="train")
```
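The config names listed in the frontmatter follow a simple pattern derived from the harness task identifiers that appear in the results below (e.g. `harness|truthfulqa:mc|0` becomes `harness_truthfulqa_mc_0`). As a purely illustrative sketch (this helper is not part of the `datasets` API, just inferred from the config list above), the mapping can be written as:

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task id (as it appears in the result keys, e.g.
    'harness|hendrycksTest-virology|5') to the corresponding dataset
    config name ('harness_hendrycksTest_virology_5')."""
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

config = task_to_config_name("harness|truthfulqa:mc|0")
# The resulting name can then be passed to load_dataset, e.g.:
# load_dataset("open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v2.0",
#              config, split="latest")
```

Note that each config also exposes a `latest` split alongside the timestamped one, so `split="latest"` always retrieves the most recent evaluation.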
## Latest results
These are the [latest results from run 2024-03-10T04:57:12.333081](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__DarkSapling-7B-v2.0/blob/main/results_2024-03-10T04-57-12.333081.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6424579193008534,
"acc_stderr": 0.032218866498356466,
"acc_norm": 0.6471361795899754,
"acc_norm_stderr": 0.032858397843778114,
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5221487837375264,
"mc2_stderr": 0.015253502717954797
},
"harness|arc:challenge|25": {
"acc": 0.6023890784982935,
"acc_stderr": 0.01430175222327954,
"acc_norm": 0.6416382252559727,
"acc_norm_stderr": 0.014012883334859857
},
"harness|hellaswag|10": {
"acc": 0.6589324835690101,
"acc_stderr": 0.004730991357194292,
"acc_norm": 0.8510256920932086,
"acc_norm_stderr": 0.003553354528132355
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6710526315789473,
"acc_stderr": 0.03823428969926604,
"acc_norm": 0.6710526315789473,
"acc_norm_stderr": 0.03823428969926604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.028049186315695248,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.028049186315695248
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7361111111111112,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.7361111111111112,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.53,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.53,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.0368122963339432,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.0368122963339432
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.042295258468165065,
"acc_norm": 0.77,
"acc_norm_stderr": 0.042295258468165065
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5914893617021276,
"acc_stderr": 0.032134180267015755,
"acc_norm": 0.5914893617021276,
"acc_norm_stderr": 0.032134180267015755
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.0470070803355104,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.0470070803355104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5862068965517241,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.5862068965517241,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3994708994708995,
"acc_stderr": 0.025225450284067887,
"acc_norm": 0.3994708994708995,
"acc_norm_stderr": 0.025225450284067887
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.04403438954768177,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.04403438954768177
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7709677419354839,
"acc_stderr": 0.023904914311782648,
"acc_norm": 0.7709677419354839,
"acc_norm_stderr": 0.023904914311782648
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.03158415324047711,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.03158415324047711
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463355,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463355
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.022935144053919443,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.022935144053919443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.029318203645206865,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.029318203645206865
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6680672268907563,
"acc_stderr": 0.03058869701378364,
"acc_norm": 0.6680672268907563,
"acc_norm_stderr": 0.03058869701378364
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8238532110091743,
"acc_stderr": 0.016332882393431385,
"acc_norm": 0.8238532110091743,
"acc_norm_stderr": 0.016332882393431385
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437406,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437406
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7805907172995781,
"acc_stderr": 0.026939106581553945,
"acc_norm": 0.7805907172995781,
"acc_norm_stderr": 0.026939106581553945
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6771300448430493,
"acc_stderr": 0.03138147637575499,
"acc_norm": 0.6771300448430493,
"acc_norm_stderr": 0.03138147637575499
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7709923664122137,
"acc_stderr": 0.036853466317118506,
"acc_norm": 0.7709923664122137,
"acc_norm_stderr": 0.036853466317118506
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8760683760683761,
"acc_stderr": 0.02158649400128139,
"acc_norm": 0.8760683760683761,
"acc_norm_stderr": 0.02158649400128139
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816508,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816508
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8173690932311622,
"acc_stderr": 0.013816335389973136,
"acc_norm": 0.8173690932311622,
"acc_norm_stderr": 0.013816335389973136
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.024027745155265012,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.024027745155265012
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3486033519553073,
"acc_stderr": 0.015937484656687033,
"acc_norm": 0.3486033519553073,
"acc_norm_stderr": 0.015937484656687033
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7516339869281046,
"acc_stderr": 0.02473998135511359,
"acc_norm": 0.7516339869281046,
"acc_norm_stderr": 0.02473998135511359
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7283950617283951,
"acc_stderr": 0.024748624490537368,
"acc_norm": 0.7283950617283951,
"acc_norm_stderr": 0.024748624490537368
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.48226950354609927,
"acc_stderr": 0.02980873964223777,
"acc_norm": 0.48226950354609927,
"acc_norm_stderr": 0.02980873964223777
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4491525423728814,
"acc_stderr": 0.012704030518851488,
"acc_norm": 0.4491525423728814,
"acc_norm_stderr": 0.012704030518851488
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.028418208619406755,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.028418208619406755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468712,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468712
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7346938775510204,
"acc_stderr": 0.028263889943784593,
"acc_norm": 0.7346938775510204,
"acc_norm_stderr": 0.028263889943784593
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.026508590656233268,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.026508590656233268
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5602409638554217,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.5602409638554217,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8245614035087719,
"acc_stderr": 0.029170885500727668,
"acc_norm": 0.8245614035087719,
"acc_norm_stderr": 0.029170885500727668
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3659730722154223,
"mc1_stderr": 0.016862941684088376,
"mc2": 0.5221487837375264,
"mc2_stderr": 0.015253502717954797
},
"harness|winogrande|5": {
"acc": 0.7861089187056038,
"acc_stderr": 0.011524466954090248
},
"harness|gsm8k|5": {
"acc": 0.4541319181197877,
"acc_stderr": 0.01371441094526456
}
}
```
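As a sketch (not part of the original card), each `harness|<task>|<shots>` entry in the results JSON above carries an `acc` field, so a macro-average over any subset of tasks can be recomputed directly. Two per-task scores from this run are used here:

```python
# Recompute a macro-average accuracy from per-task entries (values copied from
# the results JSON above; the subset of tasks is chosen only for illustration).
per_task = {
    "harness|winogrande|5": {"acc": 0.7861089187056038, "acc_stderr": 0.011524466954090248},
    "harness|gsm8k|5": {"acc": 0.4541319181197877, "acc_stderr": 0.01371441094526456},
}

accs = [entry["acc"] for entry in per_task.values()]
macro_avg = sum(accs) / len(accs)
print(f"macro-average acc over {len(accs)} tasks: {macro_avg:.4f}")  # 0.6201
```

The `"all"` block in the JSON is the same kind of macro-average, taken over every evaluated task rather than this two-task subset.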
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
kaist-ai/ntp_nsp_cosupervision_train | kaist-ai | "2024-03-10T04:59:49Z" | 0 | 1 | [
"license:cc-by-4.0",
"region:us"
] | null | "2024-03-10T04:59:48Z" | ---
license: cc-by-4.0
---
|
svjack/genshin-impact-character-image | svjack | "2024-03-10T05:25:38Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:image",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T05:25:21Z" | ---
dataset_info:
features:
- name: name
dtype: string
- name: img
dtype: image
splits:
- name: train
num_bytes: 7449633.0
num_examples: 75
download_size: 7452316
dataset_size: 7449633.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
TrevorDohm/Stack_Tokenized | TrevorDohm | "2024-04-16T00:19:53Z" | 0 | 0 | [
"task_categories:text-generation",
"language_creators:crowdsourced",
"language_creators:expert-generated",
"multilinguality:multilingual",
"language:code",
"license:other",
"size_categories:100M<n<1B",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"text-generation"
] | "2024-03-10T05:49:00Z" | ---
annotations_creators: []
language_creators:
- crowdsourced
- expert-generated
language:
- code
license:
- other
multilinguality:
- multilingual
pretty_name: The-Stack-Tokenized
size_categories:
- unknown
source_datasets: []
task_categories:
- text-generation
task_ids: []
---
|
open-llm-leaderboard-old/details_OEvortex__HelpingAI-Lite-1.5T | open-llm-leaderboard-old | "2024-03-10T06:19:18Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T06:18:56Z" | ---
pretty_name: Evaluation run of OEvortex/HelpingAI-Lite-1.5T
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OEvortex/HelpingAI-Lite-1.5T](https://huggingface.co/OEvortex/HelpingAI-Lite-1.5T)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T06:17:09.699346](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T/blob/main/results_2024-03-10T06-17-09.699346.json) (note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2635465581409758,\n\
\ \"acc_stderr\": 0.031199778547091002,\n \"acc_norm\": 0.26467294429469646,\n\
\ \"acc_norm_stderr\": 0.03197040307669128,\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3861173734844904,\n\
\ \"mc2_stderr\": 0.014144546234841945\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.28924914675767915,\n \"acc_stderr\": 0.013250012579393443,\n\
\ \"acc_norm\": 0.3122866894197952,\n \"acc_norm_stderr\": 0.013542598541688065\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.40838478390758814,\n\
\ \"acc_stderr\": 0.00490530437109087,\n \"acc_norm\": 0.5238996215893248,\n\
\ \"acc_norm_stderr\": 0.004984077906216095\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.28888888888888886,\n\
\ \"acc_stderr\": 0.03915450630414251,\n \"acc_norm\": 0.28888888888888886,\n\
\ \"acc_norm_stderr\": 0.03915450630414251\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.20394736842105263,\n \"acc_stderr\": 0.032790004063100515,\n\
\ \"acc_norm\": 0.20394736842105263,\n \"acc_norm_stderr\": 0.032790004063100515\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.33,\n\
\ \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n \
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2528301886792453,\n \"acc_stderr\": 0.026749899771241238,\n\
\ \"acc_norm\": 0.2528301886792453,\n \"acc_norm_stderr\": 0.026749899771241238\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.03396116205845335,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.03396116205845335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.27,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\"\
: 0.27,\n \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.21965317919075145,\n\
\ \"acc_stderr\": 0.031568093627031744,\n \"acc_norm\": 0.21965317919075145,\n\
\ \"acc_norm_stderr\": 0.031568093627031744\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.28431372549019607,\n \"acc_stderr\": 0.04488482852329017,\n\
\ \"acc_norm\": 0.28431372549019607,\n \"acc_norm_stderr\": 0.04488482852329017\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292326,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292326\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.04049339297748141,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.04049339297748141\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.25517241379310346,\n \"acc_stderr\": 0.03632984052707842,\n\
\ \"acc_norm\": 0.25517241379310346,\n \"acc_norm_stderr\": 0.03632984052707842\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25396825396825395,\n \"acc_stderr\": 0.022418042891113942,\n \"\
acc_norm\": 0.25396825396825395,\n \"acc_norm_stderr\": 0.022418042891113942\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.20634920634920634,\n\
\ \"acc_stderr\": 0.036196045241242515,\n \"acc_norm\": 0.20634920634920634,\n\
\ \"acc_norm_stderr\": 0.036196045241242515\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.25161290322580643,\n\
\ \"acc_stderr\": 0.024685979286239956,\n \"acc_norm\": 0.25161290322580643,\n\
\ \"acc_norm_stderr\": 0.024685979286239956\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233484,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233484\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \"acc_norm\"\
: 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2787878787878788,\n \"acc_stderr\": 0.03501438706296781,\n\
\ \"acc_norm\": 0.2787878787878788,\n \"acc_norm_stderr\": 0.03501438706296781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.23737373737373738,\n \"acc_stderr\": 0.0303137105381989,\n \"\
acc_norm\": 0.23737373737373738,\n \"acc_norm_stderr\": 0.0303137105381989\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.02977866303775296,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.02977866303775296\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2230769230769231,\n \"acc_stderr\": 0.02110773012724398,\n \
\ \"acc_norm\": 0.2230769230769231,\n \"acc_norm_stderr\": 0.02110773012724398\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073835,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073835\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.23109243697478993,\n \"acc_stderr\": 0.027381406927868966,\n\
\ \"acc_norm\": 0.23109243697478993,\n \"acc_norm_stderr\": 0.027381406927868966\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2185430463576159,\n \"acc_stderr\": 0.03374235550425694,\n \"\
acc_norm\": 0.2185430463576159,\n \"acc_norm_stderr\": 0.03374235550425694\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.21834862385321102,\n \"acc_stderr\": 0.017712600528722734,\n \"\
acc_norm\": 0.21834862385321102,\n \"acc_norm_stderr\": 0.017712600528722734\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.41203703703703703,\n \"acc_stderr\": 0.03356787758160835,\n \"\
acc_norm\": 0.41203703703703703,\n \"acc_norm_stderr\": 0.03356787758160835\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2107843137254902,\n \"acc_stderr\": 0.028626547912437416,\n \"\
acc_norm\": 0.2107843137254902,\n \"acc_norm_stderr\": 0.028626547912437416\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2616033755274262,\n \"acc_stderr\": 0.028609516716994934,\n \
\ \"acc_norm\": 0.2616033755274262,\n \"acc_norm_stderr\": 0.028609516716994934\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.34977578475336324,\n\
\ \"acc_stderr\": 0.03200736719484503,\n \"acc_norm\": 0.34977578475336324,\n\
\ \"acc_norm_stderr\": 0.03200736719484503\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2900763358778626,\n \"acc_stderr\": 0.03980066246467765,\n\
\ \"acc_norm\": 0.2900763358778626,\n \"acc_norm_stderr\": 0.03980066246467765\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302872,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302872\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.26851851851851855,\n\
\ \"acc_stderr\": 0.04284467968052192,\n \"acc_norm\": 0.26851851851851855,\n\
\ \"acc_norm_stderr\": 0.04284467968052192\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2822085889570552,\n \"acc_stderr\": 0.03536117886664743,\n\
\ \"acc_norm\": 0.2822085889570552,\n \"acc_norm_stderr\": 0.03536117886664743\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.25892857142857145,\n\
\ \"acc_stderr\": 0.041577515398656284,\n \"acc_norm\": 0.25892857142857145,\n\
\ \"acc_norm_stderr\": 0.041577515398656284\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.18446601941747573,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.18446601941747573,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.25213675213675213,\n\
\ \"acc_stderr\": 0.02844796547623102,\n \"acc_norm\": 0.25213675213675213,\n\
\ \"acc_norm_stderr\": 0.02844796547623102\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2707535121328225,\n\
\ \"acc_stderr\": 0.01588988836256049,\n \"acc_norm\": 0.2707535121328225,\n\
\ \"acc_norm_stderr\": 0.01588988836256049\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24277456647398843,\n \"acc_stderr\": 0.0230836585869842,\n\
\ \"acc_norm\": 0.24277456647398843,\n \"acc_norm_stderr\": 0.0230836585869842\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808868,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808868\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.22875816993464052,\n \"acc_stderr\": 0.024051029739912255,\n\
\ \"acc_norm\": 0.22875816993464052,\n \"acc_norm_stderr\": 0.024051029739912255\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2379421221864952,\n\
\ \"acc_stderr\": 0.024185150647818707,\n \"acc_norm\": 0.2379421221864952,\n\
\ \"acc_norm_stderr\": 0.024185150647818707\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2553191489361702,\n \"acc_stderr\": 0.026011992930902013,\n \
\ \"acc_norm\": 0.2553191489361702,\n \"acc_norm_stderr\": 0.026011992930902013\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24119947848761408,\n\
\ \"acc_stderr\": 0.010926496102034947,\n \"acc_norm\": 0.24119947848761408,\n\
\ \"acc_norm_stderr\": 0.010926496102034947\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.18382352941176472,\n \"acc_stderr\": 0.023529242185193106,\n\
\ \"acc_norm\": 0.18382352941176472,\n \"acc_norm_stderr\": 0.023529242185193106\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24836601307189543,\n \"acc_stderr\": 0.017479487001364764,\n \
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.017479487001364764\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2727272727272727,\n\
\ \"acc_stderr\": 0.04265792110940588,\n \"acc_norm\": 0.2727272727272727,\n\
\ \"acc_norm_stderr\": 0.04265792110940588\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.24897959183673468,\n \"acc_stderr\": 0.02768297952296023,\n\
\ \"acc_norm\": 0.24897959183673468,\n \"acc_norm_stderr\": 0.02768297952296023\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.03036049015401466,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.03036049015401466\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.26900584795321636,\n \"acc_stderr\": 0.0340105262010409,\n\
\ \"acc_norm\": 0.26900584795321636,\n \"acc_norm_stderr\": 0.0340105262010409\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23745410036719705,\n\
\ \"mc1_stderr\": 0.014896277441041836,\n \"mc2\": 0.3861173734844904,\n\
\ \"mc2_stderr\": 0.014144546234841945\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5832675611681136,\n \"acc_stderr\": 0.013856250072796316\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.01592115238817286,\n \
\ \"acc_stderr\": 0.0034478192723889967\n }\n}\n```"
repo_url: https://huggingface.co/OEvortex/HelpingAI-Lite-1.5T
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|arc:challenge|25_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|gsm8k|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hellaswag|10_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-17-09.699346.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T06-17-09.699346.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- '**/details_harness|winogrande|5_2024-03-10T06-17-09.699346.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T06-17-09.699346.parquet'
- config_name: results
data_files:
- split: 2024_03_10T06_17_09.699346
path:
- results_2024-03-10T06-17-09.699346.parquet
- split: latest
path:
- results_2024-03-10T06-17-09.699346.parquet
---
# Dataset Card for Evaluation run of OEvortex/HelpingAI-Lite-1.5T
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OEvortex/HelpingAI-Lite-1.5T](https://huggingface.co/OEvortex/HelpingAI-Lite-1.5T) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T",
"harness_winogrande_5",
split="train")
```
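The config names used above follow a simple convention that can be derived from the harness task ids listed in the YAML front matter: `|`, `:`, and `-` in the task id are each replaced with `_`. A minimal sketch (the helper name `task_to_config` is ours, not part of the leaderboard tooling):

```python
def task_to_config(task_id: str) -> str:
    """Map a harness task id like 'harness|winogrande|5' to its dataset config name."""
    # Replace every separator the harness uses with an underscore.
    return task_id.replace("|", "_").replace(":", "_").replace("-", "_")

# Examples:
# task_to_config("harness|winogrande|5")     -> "harness_winogrande_5"
# task_to_config("harness|truthfulqa:mc|0")  -> "harness_truthfulqa_mc_0"
```

This can be handy when iterating over many per-task configs programmatically instead of hard-coding each name.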
## Latest results
These are the [latest results from run 2024-03-10T06:17:09.699346](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-1.5T/blob/main/results_2024-03-10T06-17-09.699346.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2635465581409758,
"acc_stderr": 0.031199778547091002,
"acc_norm": 0.26467294429469646,
"acc_norm_stderr": 0.03197040307669128,
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3861173734844904,
"mc2_stderr": 0.014144546234841945
},
"harness|arc:challenge|25": {
"acc": 0.28924914675767915,
"acc_stderr": 0.013250012579393443,
"acc_norm": 0.3122866894197952,
"acc_norm_stderr": 0.013542598541688065
},
"harness|hellaswag|10": {
"acc": 0.40838478390758814,
"acc_stderr": 0.00490530437109087,
"acc_norm": 0.5238996215893248,
"acc_norm_stderr": 0.004984077906216095
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.03915450630414251,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.03915450630414251
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.20394736842105263,
"acc_stderr": 0.032790004063100515,
"acc_norm": 0.20394736842105263,
"acc_norm_stderr": 0.032790004063100515
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2528301886792453,
"acc_stderr": 0.026749899771241238,
"acc_norm": 0.2528301886792453,
"acc_norm_stderr": 0.026749899771241238
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.03396116205845335,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.03396116205845335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.27,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.21965317919075145,
"acc_stderr": 0.031568093627031744,
"acc_norm": 0.21965317919075145,
"acc_norm_stderr": 0.031568093627031744
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.28431372549019607,
"acc_stderr": 0.04488482852329017,
"acc_norm": 0.28431372549019607,
"acc_norm_stderr": 0.04488482852329017
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292326,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292326
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.04049339297748141,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.04049339297748141
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.25517241379310346,
"acc_stderr": 0.03632984052707842,
"acc_norm": 0.25517241379310346,
"acc_norm_stderr": 0.03632984052707842
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.022418042891113942,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.022418042891113942
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.20634920634920634,
"acc_stderr": 0.036196045241242515,
"acc_norm": 0.20634920634920634,
"acc_norm_stderr": 0.036196045241242515
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25161290322580643,
"acc_stderr": 0.024685979286239956,
"acc_norm": 0.25161290322580643,
"acc_norm_stderr": 0.024685979286239956
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233484,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233484
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2787878787878788,
"acc_stderr": 0.03501438706296781,
"acc_norm": 0.2787878787878788,
"acc_norm_stderr": 0.03501438706296781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.23737373737373738,
"acc_stderr": 0.0303137105381989,
"acc_norm": 0.23737373737373738,
"acc_norm_stderr": 0.0303137105381989
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.02977866303775296,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.02977866303775296
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2230769230769231,
"acc_stderr": 0.02110773012724398,
"acc_norm": 0.2230769230769231,
"acc_norm_stderr": 0.02110773012724398
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073835,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073835
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.23109243697478993,
"acc_stderr": 0.027381406927868966,
"acc_norm": 0.23109243697478993,
"acc_norm_stderr": 0.027381406927868966
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2185430463576159,
"acc_stderr": 0.03374235550425694,
"acc_norm": 0.2185430463576159,
"acc_norm_stderr": 0.03374235550425694
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.21834862385321102,
"acc_stderr": 0.017712600528722734,
"acc_norm": 0.21834862385321102,
"acc_norm_stderr": 0.017712600528722734
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.41203703703703703,
"acc_stderr": 0.03356787758160835,
"acc_norm": 0.41203703703703703,
"acc_norm_stderr": 0.03356787758160835
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2107843137254902,
"acc_stderr": 0.028626547912437416,
"acc_norm": 0.2107843137254902,
"acc_norm_stderr": 0.028626547912437416
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2616033755274262,
"acc_stderr": 0.028609516716994934,
"acc_norm": 0.2616033755274262,
"acc_norm_stderr": 0.028609516716994934
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.34977578475336324,
"acc_stderr": 0.03200736719484503,
"acc_norm": 0.34977578475336324,
"acc_norm_stderr": 0.03200736719484503
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2900763358778626,
"acc_stderr": 0.03980066246467765,
"acc_norm": 0.2900763358778626,
"acc_norm_stderr": 0.03980066246467765
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302872,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302872
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.26851851851851855,
"acc_stderr": 0.04284467968052192,
"acc_norm": 0.26851851851851855,
"acc_norm_stderr": 0.04284467968052192
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2822085889570552,
"acc_stderr": 0.03536117886664743,
"acc_norm": 0.2822085889570552,
"acc_norm_stderr": 0.03536117886664743
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.25892857142857145,
"acc_stderr": 0.041577515398656284,
"acc_norm": 0.25892857142857145,
"acc_norm_stderr": 0.041577515398656284
},
"harness|hendrycksTest-management|5": {
"acc": 0.18446601941747573,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.18446601941747573,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.25213675213675213,
"acc_stderr": 0.02844796547623102,
"acc_norm": 0.25213675213675213,
"acc_norm_stderr": 0.02844796547623102
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2707535121328225,
"acc_stderr": 0.01588988836256049,
"acc_norm": 0.2707535121328225,
"acc_norm_stderr": 0.01588988836256049
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24277456647398843,
"acc_stderr": 0.0230836585869842,
"acc_norm": 0.24277456647398843,
"acc_norm_stderr": 0.0230836585869842
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808868,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808868
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.22875816993464052,
"acc_stderr": 0.024051029739912255,
"acc_norm": 0.22875816993464052,
"acc_norm_stderr": 0.024051029739912255
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2379421221864952,
"acc_stderr": 0.024185150647818707,
"acc_norm": 0.2379421221864952,
"acc_norm_stderr": 0.024185150647818707
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2553191489361702,
"acc_stderr": 0.026011992930902013,
"acc_norm": 0.2553191489361702,
"acc_norm_stderr": 0.026011992930902013
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24119947848761408,
"acc_stderr": 0.010926496102034947,
"acc_norm": 0.24119947848761408,
"acc_norm_stderr": 0.010926496102034947
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.18382352941176472,
"acc_stderr": 0.023529242185193106,
"acc_norm": 0.18382352941176472,
"acc_norm_stderr": 0.023529242185193106
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.017479487001364764,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.017479487001364764
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.04265792110940588,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.04265792110940588
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.24897959183673468,
"acc_stderr": 0.02768297952296023,
"acc_norm": 0.24897959183673468,
"acc_norm_stderr": 0.02768297952296023
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.03036049015401466,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.03036049015401466
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.26900584795321636,
"acc_stderr": 0.0340105262010409,
"acc_norm": 0.26900584795321636,
"acc_norm_stderr": 0.0340105262010409
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23745410036719705,
"mc1_stderr": 0.014896277441041836,
"mc2": 0.3861173734844904,
"mc2_stderr": 0.014144546234841945
},
"harness|winogrande|5": {
"acc": 0.5832675611681136,
"acc_stderr": 0.013856250072796316
},
"harness|gsm8k|5": {
"acc": 0.01592115238817286,
"acc_stderr": 0.0034478192723889967
}
}
```
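The per-task entries above can be aggregated by hand. The sketch below copies three of the values from the results JSON and computes a macro-average over tasks, preferring `acc_norm` where present; the helper is illustrative only and is not part of the leaderboard tooling.

```python
# Three entries copied verbatim from the results JSON above.
results = {
    "harness|arc:challenge|25": {"acc": 0.28924914675767915, "acc_norm": 0.3122866894197952},
    "harness|hellaswag|10": {"acc": 0.40838478390758814, "acc_norm": 0.5238996215893248},
    "harness|winogrande|5": {"acc": 0.5832675611681136},
}

def macro_average(results, metric_priority=("acc_norm", "acc")):
    """Average the first available metric from each task entry."""
    values = []
    for metrics in results.values():
        for metric in metric_priority:
            if metric in metrics:
                values.append(metrics[metric])
                break
    return sum(values) / len(values)

print(round(macro_average(results), 4))  # 0.4732
```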
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jeonseonjin/ezdemo | jeonseonjin | "2024-03-10T07:10:56Z" | 0 | 0 | [
"language:en",
"license:apache-2.0",
"size_categories:10K<n<100K",
"format:csv",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T06:25:40Z" | ---
language:
- en
license: apache-2.0
size_categories:
- 1K<n<10K
configs:
- config_name: default
data_files:
- split: train
path: datasets/bodyPerfor_train*
- split: test
path: datasets/bodyPerfor_train*
--- |
open-llm-leaderboard-old/details_OEvortex__HelpingAI-Lite-2x1B | open-llm-leaderboard-old | "2024-03-10T07:00:37Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T07:00:16Z" | ---
pretty_name: Evaluation run of OEvortex/HelpingAI-Lite-2x1B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OEvortex/HelpingAI-Lite-2x1B](https://huggingface.co/OEvortex/HelpingAI-Lite-2x1B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-2x1B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T06:58:17.980002](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-2x1B/blob/main/results_2024-03-10T06-58-17.980002.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2581419283962112,\n\
\ \"acc_stderr\": 0.03079014541871701,\n \"acc_norm\": 0.25895393741265854,\n\
\ \"acc_norm_stderr\": 0.03153048801556658,\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080501,\n \"mc2\": 0.3739387301995819,\n\
\ \"mc2_stderr\": 0.013905491179809788\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.34982935153583616,\n \"acc_stderr\": 0.01393680921215828,\n\
\ \"acc_norm\": 0.3609215017064846,\n \"acc_norm_stderr\": 0.01403476138617546\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4585739892451703,\n\
\ \"acc_stderr\": 0.0049726258487026494,\n \"acc_norm\": 0.6111332403903604,\n\
\ \"acc_norm_stderr\": 0.0048649667923107\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.0416333199893227\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.15555555555555556,\n\
\ \"acc_stderr\": 0.03130948364878315,\n \"acc_norm\": 0.15555555555555556,\n\
\ \"acc_norm_stderr\": 0.03130948364878315\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.19078947368421054,\n \"acc_stderr\": 0.03197565821032499,\n\
\ \"acc_norm\": 0.19078947368421054,\n \"acc_norm_stderr\": 0.03197565821032499\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.04292346959909283,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.04292346959909283\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\": 0.27,\n\
\ \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421296,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421296\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.19653179190751446,\n\
\ \"acc_stderr\": 0.03029957466478815,\n \"acc_norm\": 0.19653179190751446,\n\
\ \"acc_norm_stderr\": 0.03029957466478815\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.21568627450980393,\n \"acc_stderr\": 0.04092563958237656,\n\
\ \"acc_norm\": 0.21568627450980393,\n \"acc_norm_stderr\": 0.04092563958237656\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.24,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.25957446808510637,\n \"acc_stderr\": 0.028659179374292323,\n\
\ \"acc_norm\": 0.25957446808510637,\n \"acc_norm_stderr\": 0.028659179374292323\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.22807017543859648,\n\
\ \"acc_stderr\": 0.03947152782669415,\n \"acc_norm\": 0.22807017543859648,\n\
\ \"acc_norm_stderr\": 0.03947152782669415\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2482758620689655,\n \"acc_stderr\": 0.03600105692727771,\n\
\ \"acc_norm\": 0.2482758620689655,\n \"acc_norm_stderr\": 0.03600105692727771\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.21428571428571427,\n\
\ \"acc_stderr\": 0.03670066451047181,\n \"acc_norm\": 0.21428571428571427,\n\
\ \"acc_norm_stderr\": 0.03670066451047181\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.22903225806451613,\n\
\ \"acc_stderr\": 0.02390491431178265,\n \"acc_norm\": 0.22903225806451613,\n\
\ \"acc_norm_stderr\": 0.02390491431178265\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.2315270935960591,\n \"acc_stderr\": 0.029678333141444437,\n\
\ \"acc_norm\": 0.2315270935960591,\n \"acc_norm_stderr\": 0.029678333141444437\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.22,\n \"acc_stderr\": 0.041633319989322695,\n \"acc_norm\"\
: 0.22,\n \"acc_norm_stderr\": 0.041633319989322695\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.23030303030303031,\n \"acc_stderr\": 0.03287666758603489,\n\
\ \"acc_norm\": 0.23030303030303031,\n \"acc_norm_stderr\": 0.03287666758603489\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.2222222222222222,\n \"acc_stderr\": 0.029620227874790465,\n \"\
acc_norm\": 0.2222222222222222,\n \"acc_norm_stderr\": 0.029620227874790465\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.22797927461139897,\n \"acc_stderr\": 0.030276909945178267,\n\
\ \"acc_norm\": 0.22797927461139897,\n \"acc_norm_stderr\": 0.030276909945178267\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2564102564102564,\n \"acc_stderr\": 0.022139081103971534,\n\
\ \"acc_norm\": 0.2564102564102564,\n \"acc_norm_stderr\": 0.022139081103971534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26666666666666666,\n \"acc_stderr\": 0.026962424325073838,\n \
\ \"acc_norm\": 0.26666666666666666,\n \"acc_norm_stderr\": 0.026962424325073838\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24789915966386555,\n \"acc_stderr\": 0.028047967224176892,\n\
\ \"acc_norm\": 0.24789915966386555,\n \"acc_norm_stderr\": 0.028047967224176892\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.19205298013245034,\n \"acc_stderr\": 0.03216298420593614,\n \"\
acc_norm\": 0.19205298013245034,\n \"acc_norm_stderr\": 0.03216298420593614\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24587155963302754,\n \"acc_stderr\": 0.018461940968708446,\n \"\
acc_norm\": 0.24587155963302754,\n \"acc_norm_stderr\": 0.018461940968708446\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.38425925925925924,\n \"acc_stderr\": 0.03317354514310742,\n \"\
acc_norm\": 0.38425925925925924,\n \"acc_norm_stderr\": 0.03317354514310742\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2696078431372549,\n \"acc_stderr\": 0.031145570659486782,\n \"\
acc_norm\": 0.2696078431372549,\n \"acc_norm_stderr\": 0.031145570659486782\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598025,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598025\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.032190792004199956,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.032190792004199956\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.25190839694656486,\n \"acc_stderr\": 0.03807387116306086,\n\
\ \"acc_norm\": 0.25190839694656486,\n \"acc_norm_stderr\": 0.03807387116306086\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.256198347107438,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.256198347107438,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2331288343558282,\n \"acc_stderr\": 0.03322015795776741,\n\
\ \"acc_norm\": 0.2331288343558282,\n \"acc_norm_stderr\": 0.03322015795776741\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.29464285714285715,\n\
\ \"acc_stderr\": 0.04327040932578728,\n \"acc_norm\": 0.29464285714285715,\n\
\ \"acc_norm_stderr\": 0.04327040932578728\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.23300970873786409,\n \"acc_stderr\": 0.041858325989283164,\n\
\ \"acc_norm\": 0.23300970873786409,\n \"acc_norm_stderr\": 0.041858325989283164\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.2948717948717949,\n\
\ \"acc_stderr\": 0.02987257770889115,\n \"acc_norm\": 0.2948717948717949,\n\
\ \"acc_norm_stderr\": 0.02987257770889115\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.04461960433384741,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.04461960433384741\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.28991060025542786,\n\
\ \"acc_stderr\": 0.016225017944770954,\n \"acc_norm\": 0.28991060025542786,\n\
\ \"acc_norm_stderr\": 0.016225017944770954\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2398843930635838,\n \"acc_stderr\": 0.02298959254312357,\n\
\ \"acc_norm\": 0.2398843930635838,\n \"acc_norm_stderr\": 0.02298959254312357\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24022346368715083,\n\
\ \"acc_stderr\": 0.014288343803925302,\n \"acc_norm\": 0.24022346368715083,\n\
\ \"acc_norm_stderr\": 0.014288343803925302\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.26143790849673204,\n \"acc_stderr\": 0.025160998214292456,\n\
\ \"acc_norm\": 0.26143790849673204,\n \"acc_norm_stderr\": 0.025160998214292456\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.26688102893890675,\n\
\ \"acc_stderr\": 0.025122637608816653,\n \"acc_norm\": 0.26688102893890675,\n\
\ \"acc_norm_stderr\": 0.025122637608816653\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.25925925925925924,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.23404255319148937,\n \"acc_stderr\": 0.025257861359432407,\n \
\ \"acc_norm\": 0.23404255319148937,\n \"acc_norm_stderr\": 0.025257861359432407\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23402868318122555,\n\
\ \"acc_stderr\": 0.010813585552659691,\n \"acc_norm\": 0.23402868318122555,\n\
\ \"acc_norm_stderr\": 0.010813585552659691\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21323529411764705,\n \"acc_stderr\": 0.024880971512294268,\n\
\ \"acc_norm\": 0.21323529411764705,\n \"acc_norm_stderr\": 0.024880971512294268\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.26633986928104575,\n \"acc_stderr\": 0.017883188134667195,\n \
\ \"acc_norm\": 0.26633986928104575,\n \"acc_norm_stderr\": 0.017883188134667195\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.04389311454644286,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.04389311454644286\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.1346938775510204,\n \"acc_stderr\": 0.021855658840811615,\n\
\ \"acc_norm\": 0.1346938775510204,\n \"acc_norm_stderr\": 0.021855658840811615\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23880597014925373,\n\
\ \"acc_stderr\": 0.030147775935409224,\n \"acc_norm\": 0.23880597014925373,\n\
\ \"acc_norm_stderr\": 0.030147775935409224\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3132530120481928,\n\
\ \"acc_stderr\": 0.03610805018031024,\n \"acc_norm\": 0.3132530120481928,\n\
\ \"acc_norm_stderr\": 0.03610805018031024\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.03565079670708312,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.03565079670708312\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.23255813953488372,\n\
\ \"mc1_stderr\": 0.014789157531080501,\n \"mc2\": 0.3739387301995819,\n\
\ \"mc2_stderr\": 0.013905491179809788\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6085240726124704,\n \"acc_stderr\": 0.013717487071290852\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.02350265352539803,\n \
\ \"acc_stderr\": 0.004172883669643954\n }\n}\n```"
repo_url: https://huggingface.co/OEvortex/HelpingAI-Lite-2x1B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|arc:challenge|25_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|gsm8k|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hellaswag|10_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-58-17.980002.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T06-58-17.980002.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- '**/details_harness|winogrande|5_2024-03-10T06-58-17.980002.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T06-58-17.980002.parquet'
- config_name: results
data_files:
- split: 2024_03_10T06_58_17.980002
path:
- results_2024-03-10T06-58-17.980002.parquet
- split: latest
path:
- results_2024-03-10T06-58-17.980002.parquet
---
# Dataset Card for Evaluation run of OEvortex/HelpingAI-Lite-2x1B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [OEvortex/HelpingAI-Lite-2x1B](https://huggingface.co/OEvortex/HelpingAI-Lite-2x1B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-2x1B",
"harness_winogrande_5",
split="train")
```
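As the configuration listing above shows, each timestamped split name is derived mechanically from the run timestamp: `-` and `:` become `_`, while the fractional-seconds `.` is kept. A minimal sketch of that rule (the helper name is our own, not part of the leaderboard tooling):

```python
def timestamp_to_split_name(ts: str) -> str:
    """Map a run timestamp like '2024-03-10T06:58:17.980002' to the
    split name used in this dataset ('2024_03_10T06_58_17.980002').

    Both '-' and ':' are replaced by '_'; the '.' before the
    fractional seconds is preserved.
    """
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split_name("2024-03-10T06:58:17.980002"))
# 2024_03_10T06_58_17.980002
```

This can be handy for selecting a specific run's split programmatically instead of hard-coding the name.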
## Latest results
These are the [latest results from run 2024-03-10T06:58:17.980002](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__HelpingAI-Lite-2x1B/blob/main/results_2024-03-10T06-58-17.980002.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in its timestamped split and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.2581419283962112,
"acc_stderr": 0.03079014541871701,
"acc_norm": 0.25895393741265854,
"acc_norm_stderr": 0.03153048801556658,
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080501,
"mc2": 0.3739387301995819,
"mc2_stderr": 0.013905491179809788
},
"harness|arc:challenge|25": {
"acc": 0.34982935153583616,
"acc_stderr": 0.01393680921215828,
"acc_norm": 0.3609215017064846,
"acc_norm_stderr": 0.01403476138617546
},
"harness|hellaswag|10": {
"acc": 0.4585739892451703,
"acc_stderr": 0.0049726258487026494,
"acc_norm": 0.6111332403903604,
"acc_norm_stderr": 0.0048649667923107
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.15555555555555556,
"acc_stderr": 0.03130948364878315,
"acc_norm": 0.15555555555555556,
"acc_norm_stderr": 0.03130948364878315
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.19078947368421054,
"acc_stderr": 0.03197565821032499,
"acc_norm": 0.19078947368421054,
"acc_norm_stderr": 0.03197565821032499
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.04292346959909283,
"acc_norm": 0.24,
"acc_norm_stderr": 0.04292346959909283
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421296,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421296
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.19653179190751446,
"acc_stderr": 0.03029957466478815,
"acc_norm": 0.19653179190751446,
"acc_norm_stderr": 0.03029957466478815
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.21568627450980393,
"acc_stderr": 0.04092563958237656,
"acc_norm": 0.21568627450980393,
"acc_norm_stderr": 0.04092563958237656
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.25957446808510637,
"acc_stderr": 0.028659179374292323,
"acc_norm": 0.25957446808510637,
"acc_norm_stderr": 0.028659179374292323
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.03947152782669415,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.03947152782669415
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2482758620689655,
"acc_stderr": 0.03600105692727771,
"acc_norm": 0.2482758620689655,
"acc_norm_stderr": 0.03600105692727771
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.21428571428571427,
"acc_stderr": 0.03670066451047181,
"acc_norm": 0.21428571428571427,
"acc_norm_stderr": 0.03670066451047181
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.22903225806451613,
"acc_stderr": 0.02390491431178265,
"acc_norm": 0.22903225806451613,
"acc_norm_stderr": 0.02390491431178265
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2315270935960591,
"acc_stderr": 0.029678333141444437,
"acc_norm": 0.2315270935960591,
"acc_norm_stderr": 0.029678333141444437
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.041633319989322695,
"acc_norm": 0.22,
"acc_norm_stderr": 0.041633319989322695
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.23030303030303031,
"acc_stderr": 0.03287666758603489,
"acc_norm": 0.23030303030303031,
"acc_norm_stderr": 0.03287666758603489
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.029620227874790465,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.029620227874790465
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.22797927461139897,
"acc_stderr": 0.030276909945178267,
"acc_norm": 0.22797927461139897,
"acc_norm_stderr": 0.030276909945178267
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2564102564102564,
"acc_stderr": 0.022139081103971534,
"acc_norm": 0.2564102564102564,
"acc_norm_stderr": 0.022139081103971534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26666666666666666,
"acc_stderr": 0.026962424325073838,
"acc_norm": 0.26666666666666666,
"acc_norm_stderr": 0.026962424325073838
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24789915966386555,
"acc_stderr": 0.028047967224176892,
"acc_norm": 0.24789915966386555,
"acc_norm_stderr": 0.028047967224176892
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.19205298013245034,
"acc_stderr": 0.03216298420593614,
"acc_norm": 0.19205298013245034,
"acc_norm_stderr": 0.03216298420593614
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24587155963302754,
"acc_stderr": 0.018461940968708446,
"acc_norm": 0.24587155963302754,
"acc_norm_stderr": 0.018461940968708446
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.38425925925925924,
"acc_stderr": 0.03317354514310742,
"acc_norm": 0.38425925925925924,
"acc_norm_stderr": 0.03317354514310742
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2696078431372549,
"acc_stderr": 0.031145570659486782,
"acc_norm": 0.2696078431372549,
"acc_norm_stderr": 0.031145570659486782
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598025,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598025
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.032190792004199956,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.032190792004199956
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.25190839694656486,
"acc_stderr": 0.03807387116306086,
"acc_norm": 0.25190839694656486,
"acc_norm_stderr": 0.03807387116306086
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.256198347107438,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.256198347107438,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2331288343558282,
"acc_stderr": 0.03322015795776741,
"acc_norm": 0.2331288343558282,
"acc_norm_stderr": 0.03322015795776741
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.29464285714285715,
"acc_stderr": 0.04327040932578728,
"acc_norm": 0.29464285714285715,
"acc_norm_stderr": 0.04327040932578728
},
"harness|hendrycksTest-management|5": {
"acc": 0.23300970873786409,
"acc_stderr": 0.041858325989283164,
"acc_norm": 0.23300970873786409,
"acc_norm_stderr": 0.041858325989283164
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.2948717948717949,
"acc_stderr": 0.02987257770889115,
"acc_norm": 0.2948717948717949,
"acc_norm_stderr": 0.02987257770889115
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.27,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.27,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.28991060025542786,
"acc_stderr": 0.016225017944770954,
"acc_norm": 0.28991060025542786,
"acc_norm_stderr": 0.016225017944770954
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2398843930635838,
"acc_stderr": 0.02298959254312357,
"acc_norm": 0.2398843930635838,
"acc_norm_stderr": 0.02298959254312357
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24022346368715083,
"acc_stderr": 0.014288343803925302,
"acc_norm": 0.24022346368715083,
"acc_norm_stderr": 0.014288343803925302
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.26143790849673204,
"acc_stderr": 0.025160998214292456,
"acc_norm": 0.26143790849673204,
"acc_norm_stderr": 0.025160998214292456
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.26688102893890675,
"acc_stderr": 0.025122637608816653,
"acc_norm": 0.26688102893890675,
"acc_norm_stderr": 0.025122637608816653
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.23404255319148937,
"acc_stderr": 0.025257861359432407,
"acc_norm": 0.23404255319148937,
"acc_norm_stderr": 0.025257861359432407
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23402868318122555,
"acc_stderr": 0.010813585552659691,
"acc_norm": 0.23402868318122555,
"acc_norm_stderr": 0.010813585552659691
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21323529411764705,
"acc_stderr": 0.024880971512294268,
"acc_norm": 0.21323529411764705,
"acc_norm_stderr": 0.024880971512294268
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.26633986928104575,
"acc_stderr": 0.017883188134667195,
"acc_norm": 0.26633986928104575,
"acc_norm_stderr": 0.017883188134667195
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3,
"acc_stderr": 0.04389311454644286,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04389311454644286
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.1346938775510204,
"acc_stderr": 0.021855658840811615,
"acc_norm": 0.1346938775510204,
"acc_norm_stderr": 0.021855658840811615
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23880597014925373,
"acc_stderr": 0.030147775935409224,
"acc_norm": 0.23880597014925373,
"acc_norm_stderr": 0.030147775935409224
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3132530120481928,
"acc_stderr": 0.03610805018031024,
"acc_norm": 0.3132530120481928,
"acc_norm_stderr": 0.03610805018031024
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.03565079670708312,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.03565079670708312
},
"harness|truthfulqa:mc|0": {
"mc1": 0.23255813953488372,
"mc1_stderr": 0.014789157531080501,
"mc2": 0.3739387301995819,
"mc2_stderr": 0.013905491179809788
},
"harness|winogrande|5": {
"acc": 0.6085240726124704,
"acc_stderr": 0.013717487071290852
},
"harness|gsm8k|5": {
"acc": 0.02350265352539803,
"acc_stderr": 0.004172883669643954
}
}
```
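The per-task metrics in the JSON above can be aggregated with a few lines of Python. The snippet below is a minimal sketch that averages the headline metric over a hand-copied excerpt of the results; the task selection and the simple mean are illustrative only, not the leaderboard's official averaging scheme.

```python
# Excerpt of the aggregated results above (values copied from the JSON block);
# in practice the full results_*.json file from the repo would be loaded instead.
results = {
    "harness|arc:challenge|25": {"acc_norm": 0.3609215017064846},
    "harness|hellaswag|10": {"acc_norm": 0.6111332403903604},
    "harness|winogrande|5": {"acc": 0.6085240726124704},
    "harness|gsm8k|5": {"acc": 0.02350265352539803},
}

# Use acc_norm as the headline metric when present, falling back to acc.
scores = [m.get("acc_norm", m.get("acc")) for m in results.values()]
average = sum(scores) / len(scores)
print(round(average, 4))
```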
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
open-llm-leaderboard-old/details_TeeZee__GALAXY-XB-v.03 | open-llm-leaderboard-old | "2024-03-10T07:01:03Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T07:00:42Z" | ---
pretty_name: Evaluation run of TeeZee/GALAXY-XB-v.03
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/GALAXY-XB-v.03](https://huggingface.co/TeeZee/GALAXY-XB-v.03) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.03\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T06:58:23.693550](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.03/blob/main/results_2024-03-10T06-58-23.693550.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6441496168781581,\n\
\ \"acc_stderr\": 0.03191652320038853,\n \"acc_norm\": 0.6482826338340464,\n\
\ \"acc_norm_stderr\": 0.03255838951890377,\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.4418731708213551,\n\
\ \"mc2_stderr\": 0.014382809652080579\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5895904436860068,\n \"acc_stderr\": 0.014374922192642666,\n\
\ \"acc_norm\": 0.6177474402730375,\n \"acc_norm_stderr\": 0.014200454049979275\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6475801633140809,\n\
\ \"acc_stderr\": 0.004767475366689761,\n \"acc_norm\": 0.8358892650866361,\n\
\ \"acc_norm_stderr\": 0.003696190832547414\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5777777777777777,\n\
\ \"acc_stderr\": 0.04266763404099582,\n \"acc_norm\": 0.5777777777777777,\n\
\ \"acc_norm_stderr\": 0.04266763404099582\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.03523807393012047,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.03523807393012047\n \
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6981132075471698,\n \"acc_stderr\": 0.02825420034443866,\n\
\ \"acc_norm\": 0.6981132075471698,\n \"acc_norm_stderr\": 0.02825420034443866\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.04975698519562428,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.04975698519562428\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \"acc_norm\": 0.5,\n\
\ \"acc_norm_stderr\": 0.050251890762960605\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6820809248554913,\n\
\ \"acc_stderr\": 0.0355068398916558,\n \"acc_norm\": 0.6820809248554913,\n\
\ \"acc_norm_stderr\": 0.0355068398916558\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.047840607041056527,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.047840607041056527\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419035,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419035\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.43386243386243384,\n \"acc_stderr\": 0.0255250343824749,\n \"\
acc_norm\": 0.43386243386243384,\n \"acc_norm_stderr\": 0.0255250343824749\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8161290322580645,\n\
\ \"acc_stderr\": 0.02203721734026784,\n \"acc_norm\": 0.8161290322580645,\n\
\ \"acc_norm_stderr\": 0.02203721734026784\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4729064039408867,\n \"acc_stderr\": 0.03512819077876106,\n\
\ \"acc_norm\": 0.4729064039408867,\n \"acc_norm_stderr\": 0.03512819077876106\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.68,\n \"acc_stderr\": 0.04688261722621504,\n \"acc_norm\"\
: 0.68,\n \"acc_norm_stderr\": 0.04688261722621504\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8181818181818182,\n \"acc_stderr\": 0.027479603010538804,\n \"\
acc_norm\": 0.8181818181818182,\n \"acc_norm_stderr\": 0.027479603010538804\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.021995311963644234,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.021995311963644234\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6487179487179487,\n \"acc_stderr\": 0.024203665177902803,\n\
\ \"acc_norm\": 0.6487179487179487,\n \"acc_norm_stderr\": 0.024203665177902803\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131143,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131143\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6596638655462185,\n \"acc_stderr\": 0.030778057422931673,\n\
\ \"acc_norm\": 0.6596638655462185,\n \"acc_norm_stderr\": 0.030778057422931673\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8311926605504587,\n \"acc_stderr\": 0.016060056268530312,\n \"\
acc_norm\": 0.8311926605504587,\n \"acc_norm_stderr\": 0.016060056268530312\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.6064814814814815,\n \"acc_stderr\": 0.03331747876370312,\n \"\
acc_norm\": 0.6064814814814815,\n \"acc_norm_stderr\": 0.03331747876370312\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579647,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579647\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229146,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229146\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.036401182719909456,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.036401182719909456\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.44642857142857145,\n\
\ \"acc_stderr\": 0.04718471485219588,\n \"acc_norm\": 0.44642857142857145,\n\
\ \"acc_norm_stderr\": 0.04718471485219588\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8237547892720306,\n\
\ \"acc_stderr\": 0.013625556907993459,\n \"acc_norm\": 0.8237547892720306,\n\
\ \"acc_norm_stderr\": 0.013625556907993459\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.02353292543104429,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.02353292543104429\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2581005586592179,\n\
\ \"acc_stderr\": 0.014635185616527836,\n \"acc_norm\": 0.2581005586592179,\n\
\ \"acc_norm_stderr\": 0.014635185616527836\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.761437908496732,\n \"acc_stderr\": 0.02440439492808787,\n\
\ \"acc_norm\": 0.761437908496732,\n \"acc_norm_stderr\": 0.02440439492808787\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.02592237178881877,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.02592237178881877\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48565840938722293,\n\
\ \"acc_stderr\": 0.012764981829524269,\n \"acc_norm\": 0.48565840938722293,\n\
\ \"acc_norm_stderr\": 0.012764981829524269\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274053,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274053\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6895424836601307,\n \"acc_stderr\": 0.01871806705262322,\n \
\ \"acc_norm\": 0.6895424836601307,\n \"acc_norm_stderr\": 0.01871806705262322\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712845,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712845\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8656716417910447,\n\
\ \"acc_stderr\": 0.024112678240900798,\n \"acc_norm\": 0.8656716417910447,\n\
\ \"acc_norm_stderr\": 0.024112678240900798\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.9,\n \"acc_stderr\": 0.03015113445777634,\n \
\ \"acc_norm\": 0.9,\n \"acc_norm_stderr\": 0.03015113445777634\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2913096695226438,\n\
\ \"mc1_stderr\": 0.01590598704818483,\n \"mc2\": 0.4418731708213551,\n\
\ \"mc2_stderr\": 0.014382809652080579\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8105761641673244,\n \"acc_stderr\": 0.011012790432989243\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.45034116755117515,\n \
\ \"acc_stderr\": 0.013704390498582816\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/GALAXY-XB-v.03
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|arc:challenge|25_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|gsm8k|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hellaswag|10_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-58-23.693550.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T06-58-23.693550.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- '**/details_harness|winogrande|5_2024-03-10T06-58-23.693550.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T06-58-23.693550.parquet'
- config_name: results
data_files:
- split: 2024_03_10T06_58_23.693550
path:
- results_2024-03-10T06-58-23.693550.parquet
- split: latest
path:
- results_2024-03-10T06-58-23.693550.parquet
---
# Dataset Card for Evaluation run of TeeZee/GALAXY-XB-v.03
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/GALAXY-XB-v.03](https://huggingface.co/TeeZee/GALAXY-XB-v.03) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.03",
"harness_winogrande_5",
split="train")
```
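As noted above, each run's split is named after its timestamp. A minimal sketch of that convention, inferred from the names in this card (the ISO timestamp `2024-03-10T06:58:23.693550` becomes the split `2024_03_10T06_58_23.693550`, while parquet filenames use `-` in place of `:`); the helper name is hypothetical:

```python
def timestamp_to_split(iso_timestamp: str) -> str:
    """Derive a split name from a run's ISO timestamp by
    replacing '-' and ':' with '_', as seen in this card."""
    return iso_timestamp.replace("-", "_").replace(":", "_")

def timestamp_to_filename_stamp(iso_timestamp: str) -> str:
    """Derive the timestamp used in parquet filenames,
    which keeps '-' in the date and replaces ':' with '-'."""
    return iso_timestamp.replace(":", "-")

# Example with this card's run timestamp:
print(timestamp_to_split("2024-03-10T06:58:23.693550"))
# → 2024_03_10T06_58_23.693550
print(timestamp_to_filename_stamp("2024-03-10T06:58:23.693550"))
# → 2024-03-10T06-58-23.693550
```

Passing such a split name instead of `"latest"` pins the load to one specific run.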
## Latest results
These are the [latest results from run 2024-03-10T06:58:23.693550](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.03/blob/main/results_2024-03-10T06-58-23.693550.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6441496168781581,
"acc_stderr": 0.03191652320038853,
"acc_norm": 0.6482826338340464,
"acc_norm_stderr": 0.03255838951890377,
"mc1": 0.2913096695226438,
"mc1_stderr": 0.01590598704818483,
"mc2": 0.4418731708213551,
"mc2_stderr": 0.014382809652080579
},
"harness|arc:challenge|25": {
"acc": 0.5895904436860068,
"acc_stderr": 0.014374922192642666,
"acc_norm": 0.6177474402730375,
"acc_norm_stderr": 0.014200454049979275
},
"harness|hellaswag|10": {
"acc": 0.6475801633140809,
"acc_stderr": 0.004767475366689761,
"acc_norm": 0.8358892650866361,
"acc_norm_stderr": 0.003696190832547414
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5777777777777777,
"acc_stderr": 0.04266763404099582,
"acc_norm": 0.5777777777777777,
"acc_norm_stderr": 0.04266763404099582
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.75,
"acc_stderr": 0.03523807393012047,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03523807393012047
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6981132075471698,
"acc_stderr": 0.02825420034443866,
"acc_norm": 0.6981132075471698,
"acc_norm_stderr": 0.02825420034443866
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.43,
"acc_stderr": 0.04975698519562428,
"acc_norm": 0.43,
"acc_norm_stderr": 0.04975698519562428
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6820809248554913,
"acc_stderr": 0.0355068398916558,
"acc_norm": 0.6820809248554913,
"acc_norm_stderr": 0.0355068398916558
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.047840607041056527,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.047840607041056527
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.79,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419035,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419035
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.43386243386243384,
"acc_stderr": 0.0255250343824749,
"acc_norm": 0.43386243386243384,
"acc_norm_stderr": 0.0255250343824749
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8161290322580645,
"acc_stderr": 0.02203721734026784,
"acc_norm": 0.8161290322580645,
"acc_norm_stderr": 0.02203721734026784
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4729064039408867,
"acc_stderr": 0.03512819077876106,
"acc_norm": 0.4729064039408867,
"acc_norm_stderr": 0.03512819077876106
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.68,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.68,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8181818181818182,
"acc_stderr": 0.027479603010538804,
"acc_norm": 0.8181818181818182,
"acc_norm_stderr": 0.027479603010538804
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.021995311963644234,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.021995311963644234
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6487179487179487,
"acc_stderr": 0.024203665177902803,
"acc_norm": 0.6487179487179487,
"acc_norm_stderr": 0.024203665177902803
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131143,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131143
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6596638655462185,
"acc_stderr": 0.030778057422931673,
"acc_norm": 0.6596638655462185,
"acc_norm_stderr": 0.030778057422931673
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8311926605504587,
"acc_stderr": 0.016060056268530312,
"acc_norm": 0.8311926605504587,
"acc_norm_stderr": 0.016060056268530312
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.6064814814814815,
"acc_stderr": 0.03331747876370312,
"acc_norm": 0.6064814814814815,
"acc_norm_stderr": 0.03331747876370312
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579647,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579647
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229146,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229146
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.036401182719909456,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.036401182719909456
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.44642857142857145,
"acc_stderr": 0.04718471485219588,
"acc_norm": 0.44642857142857145,
"acc_norm_stderr": 0.04718471485219588
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8237547892720306,
"acc_stderr": 0.013625556907993459,
"acc_norm": 0.8237547892720306,
"acc_norm_stderr": 0.013625556907993459
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.02353292543104429,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.02353292543104429
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2581005586592179,
"acc_stderr": 0.014635185616527836,
"acc_norm": 0.2581005586592179,
"acc_norm_stderr": 0.014635185616527836
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.761437908496732,
"acc_stderr": 0.02440439492808787,
"acc_norm": 0.761437908496732,
"acc_norm_stderr": 0.02440439492808787
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.02592237178881877,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.02592237178881877
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48565840938722293,
"acc_stderr": 0.012764981829524269,
"acc_norm": 0.48565840938722293,
"acc_norm_stderr": 0.012764981829524269
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6895424836601307,
"acc_stderr": 0.01871806705262322,
"acc_norm": 0.6895424836601307,
"acc_norm_stderr": 0.01871806705262322
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712845,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712845
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8656716417910447,
"acc_stderr": 0.024112678240900798,
"acc_norm": 0.8656716417910447,
"acc_norm_stderr": 0.024112678240900798
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.9,
"acc_stderr": 0.03015113445777634,
"acc_norm": 0.9,
"acc_norm_stderr": 0.03015113445777634
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2913096695226438,
"mc1_stderr": 0.01590598704818483,
"mc2": 0.4418731708213551,
"mc2_stderr": 0.014382809652080579
},
"harness|winogrande|5": {
"acc": 0.8105761641673244,
"acc_stderr": 0.011012790432989243
},
"harness|gsm8k|5": {
"acc": 0.45034116755117515,
"acc_stderr": 0.013704390498582816
}
}
```
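The per-task metrics in the JSON above can be post-processed locally. As a minimal sketch (the accuracy values are copied from the results above; the subset of tasks chosen here is arbitrary), ranking a handful of MMLU subtasks by accuracy:

```python
# Sketch: rank a few of the per-task accuracies reported in the results JSON above.
# The numbers are copied verbatim from this card's "latest results" section.
results = {
    "harness|hendrycksTest-computer_security|5": 0.79,
    "harness|hendrycksTest-moral_scenarios|5": 0.2581005586592179,
    "harness|hendrycksTest-us_foreign_policy|5": 0.9,
    "harness|hendrycksTest-high_school_mathematics|5": 0.34074074074074073,
}

# Sort tasks from strongest to weakest by accuracy.
ranked = sorted(results.items(), key=lambda kv: kv[1], reverse=True)
best_task, best_acc = ranked[0]
worst_task, worst_acc = ranked[-1]
print(best_task, round(best_acc, 3))    # → harness|hendrycksTest-us_foreign_policy|5 0.9
print(worst_task, round(worst_acc, 3))  # → harness|hendrycksTest-moral_scenarios|5 0.258
```

The same pattern applies to the full results file once loaded; only the dictionary of task names and scores changes.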
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
thanhnamitit/alpaca_translate_GPT_35_10_20k | thanhnamitit | "2024-03-10T07:18:58Z" | 0 | 0 | [
"license:unknown",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T07:11:12Z" | ---
license: unknown
---
|
FireNutter/Voices | FireNutter | "2024-04-13T15:56:35Z" | 0 | 0 | [
"license:creativeml-openrail-m",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T07:11:40Z" | ---
license: creativeml-openrail-m
---
|
open-llm-leaderboard-old/details_RaoFoundation__774M-03_09_2024 | open-llm-leaderboard-old | "2024-03-10T07:32:33Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T07:12:32Z" | ---
pretty_name: Evaluation run of RaoFoundation/774M-03_09_2024
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [RaoFoundation/774M-03_09_2024](https://huggingface.co/RaoFoundation/774M-03_09_2024)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T07:31:11.420594](https://huggingface.co/datasets/open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024/blob/main/results_2024-03-10T07-31-11.420594.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25714164108824067,\n\
\ \"acc_stderr\": 0.03086305887436439,\n \"acc_norm\": 0.2589960153068607,\n\
\ \"acc_norm_stderr\": 0.03165311681557453,\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059686,\n \"mc2\": 0.3444347337952659,\n\
\ \"mc2_stderr\": 0.013606216674916146\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2790102389078498,\n \"acc_stderr\": 0.013106784883601345,\n\
\ \"acc_norm\": 0.302901023890785,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.41366261700856405,\n\
\ \"acc_stderr\": 0.00491482938498347,\n \"acc_norm\": 0.5388368850826528,\n\
\ \"acc_norm_stderr\": 0.0049747064284342765\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.2740740740740741,\n\
\ \"acc_stderr\": 0.03853254836552003,\n \"acc_norm\": 0.2740740740740741,\n\
\ \"acc_norm_stderr\": 0.03853254836552003\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123394,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123394\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.026055296901152915,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.026055296901152915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2638888888888889,\n\
\ \"acc_stderr\": 0.03685651095897532,\n \"acc_norm\": 0.2638888888888889,\n\
\ \"acc_norm_stderr\": 0.03685651095897532\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536955,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536955\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749912,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749912\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.23529411764705882,\n \"acc_stderr\": 0.04220773659171453,\n\
\ \"acc_norm\": 0.23529411764705882,\n \"acc_norm_stderr\": 0.04220773659171453\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \"acc_norm\": 0.33,\n\
\ \"acc_norm_stderr\": 0.047258156262526045\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2765957446808511,\n \"acc_stderr\": 0.029241883869628834,\n\
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.029241883869628834\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.0414243971948936,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.0414243971948936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.33793103448275863,\n \"acc_stderr\": 0.039417076320648906,\n\
\ \"acc_norm\": 0.33793103448275863,\n \"acc_norm_stderr\": 0.039417076320648906\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.20899470899470898,\n \"acc_stderr\": 0.02094048156533486,\n \"\
acc_norm\": 0.20899470899470898,\n \"acc_norm_stderr\": 0.02094048156533486\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.1746031746031746,\n\
\ \"acc_stderr\": 0.0339549002085611,\n \"acc_norm\": 0.1746031746031746,\n\
\ \"acc_norm_stderr\": 0.0339549002085611\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.25483870967741934,\n \"acc_stderr\": 0.024790118459332204,\n \"\
acc_norm\": 0.25483870967741934,\n \"acc_norm_stderr\": 0.024790118459332204\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.2660098522167488,\n \"acc_stderr\": 0.03108982600293752,\n \"\
acc_norm\": 0.2660098522167488,\n \"acc_norm_stderr\": 0.03108982600293752\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2,\n \"acc_stderr\": 0.031234752377721175,\n \
\ \"acc_norm\": 0.2,\n \"acc_norm_stderr\": 0.031234752377721175\n \
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.25252525252525254,\n \"acc_stderr\": 0.030954055470365897,\n \"\
acc_norm\": 0.25252525252525254,\n \"acc_norm_stderr\": 0.030954055470365897\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.21761658031088082,\n \"acc_stderr\": 0.029778663037752943,\n\
\ \"acc_norm\": 0.21761658031088082,\n \"acc_norm_stderr\": 0.029778663037752943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.25384615384615383,\n \"acc_stderr\": 0.022066054378726257,\n\
\ \"acc_norm\": 0.25384615384615383,\n \"acc_norm_stderr\": 0.022066054378726257\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.26296296296296295,\n \"acc_stderr\": 0.02684205787383371,\n \
\ \"acc_norm\": 0.26296296296296295,\n \"acc_norm_stderr\": 0.02684205787383371\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.19327731092436976,\n \"acc_stderr\": 0.025649470265889183,\n\
\ \"acc_norm\": 0.19327731092436976,\n \"acc_norm_stderr\": 0.025649470265889183\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.036030385453603826,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.036030385453603826\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.22568807339449543,\n \"acc_stderr\": 0.017923087667803053,\n \"\
acc_norm\": 0.22568807339449543,\n \"acc_norm_stderr\": 0.017923087667803053\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.18518518518518517,\n \"acc_stderr\": 0.026491914727355168,\n \"\
acc_norm\": 0.18518518518518517,\n \"acc_norm_stderr\": 0.026491914727355168\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.22058823529411764,\n \"acc_stderr\": 0.029102254389674082,\n \"\
acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.029102254389674082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.3080168776371308,\n \"acc_stderr\": 0.030052389335605695,\n \
\ \"acc_norm\": 0.3080168776371308,\n \"acc_norm_stderr\": 0.030052389335605695\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.336322869955157,\n\
\ \"acc_stderr\": 0.03170882426845501,\n \"acc_norm\": 0.336322869955157,\n\
\ \"acc_norm_stderr\": 0.03170882426845501\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.21374045801526717,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.21374045801526717,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.2809917355371901,\n \"acc_stderr\": 0.04103203830514512,\n \"\
acc_norm\": 0.2809917355371901,\n \"acc_norm_stderr\": 0.04103203830514512\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.25925925925925924,\n\
\ \"acc_stderr\": 0.04236511258094633,\n \"acc_norm\": 0.25925925925925924,\n\
\ \"acc_norm_stderr\": 0.04236511258094633\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2392638036809816,\n \"acc_stderr\": 0.03351953879521269,\n\
\ \"acc_norm\": 0.2392638036809816,\n \"acc_norm_stderr\": 0.03351953879521269\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.04246624336697625,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.04246624336697625\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.22330097087378642,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.22330097087378642,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.26495726495726496,\n\
\ \"acc_stderr\": 0.02891120880274949,\n \"acc_norm\": 0.26495726495726496,\n\
\ \"acc_norm_stderr\": 0.02891120880274949\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.27458492975734355,\n\
\ \"acc_stderr\": 0.015959829933084046,\n \"acc_norm\": 0.27458492975734355,\n\
\ \"acc_norm_stderr\": 0.015959829933084046\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24855491329479767,\n \"acc_stderr\": 0.023267528432100174,\n\
\ \"acc_norm\": 0.24855491329479767,\n \"acc_norm_stderr\": 0.023267528432100174\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24692737430167597,\n\
\ \"acc_stderr\": 0.014422292204808842,\n \"acc_norm\": 0.24692737430167597,\n\
\ \"acc_norm_stderr\": 0.014422292204808842\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.2581699346405229,\n \"acc_stderr\": 0.025058503316958167,\n\
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.025058503316958167\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2604501607717042,\n\
\ \"acc_stderr\": 0.02492672322484553,\n \"acc_norm\": 0.2604501607717042,\n\
\ \"acc_norm_stderr\": 0.02492672322484553\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.02551873104953778,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.02551873104953778\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.24837027379400262,\n\
\ \"acc_stderr\": 0.011035212598034517,\n \"acc_norm\": 0.24837027379400262,\n\
\ \"acc_norm_stderr\": 0.011035212598034517\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.34558823529411764,\n \"acc_stderr\": 0.02888819310398864,\n\
\ \"acc_norm\": 0.34558823529411764,\n \"acc_norm_stderr\": 0.02888819310398864\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.24183006535947713,\n \"acc_stderr\": 0.017322789207784326,\n \
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.017322789207784326\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2909090909090909,\n\
\ \"acc_stderr\": 0.04350271442923243,\n \"acc_norm\": 0.2909090909090909,\n\
\ \"acc_norm_stderr\": 0.04350271442923243\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.17959183673469387,\n \"acc_stderr\": 0.024573293589585637,\n\
\ \"acc_norm\": 0.17959183673469387,\n \"acc_norm_stderr\": 0.024573293589585637\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.23383084577114427,\n\
\ \"acc_stderr\": 0.029929415408348398,\n \"acc_norm\": 0.23383084577114427,\n\
\ \"acc_norm_stderr\": 0.029929415408348398\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3192771084337349,\n\
\ \"acc_stderr\": 0.0362933532994786,\n \"acc_norm\": 0.3192771084337349,\n\
\ \"acc_norm_stderr\": 0.0362933532994786\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.2631578947368421,\n \"acc_stderr\": 0.03377310252209196,\n\
\ \"acc_norm\": 0.2631578947368421,\n \"acc_norm_stderr\": 0.03377310252209196\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.21297429620563035,\n\
\ \"mc1_stderr\": 0.014332203787059686,\n \"mc2\": 0.3444347337952659,\n\
\ \"mc2_stderr\": 0.013606216674916146\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5509076558800315,\n \"acc_stderr\": 0.01397945938914085\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.003032600454890068,\n \
\ \"acc_stderr\": 0.0015145735612245486\n }\n}\n```"
repo_url: https://huggingface.co/RaoFoundation/774M-03_09_2024
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-11-12.882374.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-31-11.420594.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-31-11.420594.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- '**/details_harness|winogrande|5_2024-03-10T07-11-12.882374.parquet'
- split: 2024_03_10T07_31_11.420594
path:
- '**/details_harness|winogrande|5_2024-03-10T07-31-11.420594.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T07-31-11.420594.parquet'
- config_name: results
data_files:
- split: 2024_03_10T07_11_12.882374
path:
- results_2024-03-10T07-11-12.882374.parquet
- split: 2024_03_10T07_31_11.420594
path:
- results_2024-03-10T07-31-11.420594.parquet
- split: latest
path:
- results_2024-03-10T07-31-11.420594.parquet
---
# Dataset Card for Evaluation run of RaoFoundation/774M-03_09_2024
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [RaoFoundation/774M-03_09_2024](https://huggingface.co/RaoFoundation/774M-03_09_2024) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can, for instance, do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024",
"harness_winogrande_5",
split="train")
```
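Each timestamped split name in the configuration list above is derived from the run timestamp, with `-` and `:` replaced by `_`. A small helper (an assumption inferred from the split names in the YAML metadata above, not an official API) can build the split name for a specific run:

```python
def run_timestamp_to_split(ts: str) -> str:
    # e.g. "2024-03-10T07:31:11.420594" -> "2024_03_10T07_31_11.420594"
    # (split names, as listed above, use "_" in place of "-" and ":")
    return ts.replace("-", "_").replace(":", "_")

print(run_timestamp_to_split("2024-03-10T07:31:11.420594"))
# → 2024_03_10T07_31_11.420594
```

The returned name can then be passed as the `split` argument to `load_dataset` in place of `"train"` or `"latest"`.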
## Latest results
These are the [latest results from run 2024-03-10T07:31:11.420594](https://huggingface.co/datasets/open-llm-leaderboard/details_RaoFoundation__774M-03_09_2024/blob/main/results_2024-03-10T07-31-11.420594.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25714164108824067,
"acc_stderr": 0.03086305887436439,
"acc_norm": 0.2589960153068607,
"acc_norm_stderr": 0.03165311681557453,
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059686,
"mc2": 0.3444347337952659,
"mc2_stderr": 0.013606216674916146
},
"harness|arc:challenge|25": {
"acc": 0.2790102389078498,
"acc_stderr": 0.013106784883601345,
"acc_norm": 0.302901023890785,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.41366261700856405,
"acc_stderr": 0.00491482938498347,
"acc_norm": 0.5388368850826528,
"acc_norm_stderr": 0.0049747064284342765
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.2740740740740741,
"acc_stderr": 0.03853254836552003,
"acc_norm": 0.2740740740740741,
"acc_norm_stderr": 0.03853254836552003
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123394,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123394
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.026055296901152915,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.026055296901152915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2638888888888889,
"acc_stderr": 0.03685651095897532,
"acc_norm": 0.2638888888888889,
"acc_norm_stderr": 0.03685651095897532
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536955,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536955
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749912,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749912
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.23529411764705882,
"acc_stderr": 0.04220773659171453,
"acc_norm": 0.23529411764705882,
"acc_norm_stderr": 0.04220773659171453
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.029241883869628834,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.029241883869628834
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.0414243971948936,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.0414243971948936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.33793103448275863,
"acc_stderr": 0.039417076320648906,
"acc_norm": 0.33793103448275863,
"acc_norm_stderr": 0.039417076320648906
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.20899470899470898,
"acc_stderr": 0.02094048156533486,
"acc_norm": 0.20899470899470898,
"acc_norm_stderr": 0.02094048156533486
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.1746031746031746,
"acc_stderr": 0.0339549002085611,
"acc_norm": 0.1746031746031746,
"acc_norm_stderr": 0.0339549002085611
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.25483870967741934,
"acc_stderr": 0.024790118459332204,
"acc_norm": 0.25483870967741934,
"acc_norm_stderr": 0.024790118459332204
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.2660098522167488,
"acc_stderr": 0.03108982600293752,
"acc_norm": 0.2660098522167488,
"acc_norm_stderr": 0.03108982600293752
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2,
"acc_stderr": 0.031234752377721175,
"acc_norm": 0.2,
"acc_norm_stderr": 0.031234752377721175
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.25252525252525254,
"acc_stderr": 0.030954055470365897,
"acc_norm": 0.25252525252525254,
"acc_norm_stderr": 0.030954055470365897
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.21761658031088082,
"acc_stderr": 0.029778663037752943,
"acc_norm": 0.21761658031088082,
"acc_norm_stderr": 0.029778663037752943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.25384615384615383,
"acc_stderr": 0.022066054378726257,
"acc_norm": 0.25384615384615383,
"acc_norm_stderr": 0.022066054378726257
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.26296296296296295,
"acc_stderr": 0.02684205787383371,
"acc_norm": 0.26296296296296295,
"acc_norm_stderr": 0.02684205787383371
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.19327731092436976,
"acc_stderr": 0.025649470265889183,
"acc_norm": 0.19327731092436976,
"acc_norm_stderr": 0.025649470265889183
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.036030385453603826,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.036030385453603826
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.22568807339449543,
"acc_stderr": 0.017923087667803053,
"acc_norm": 0.22568807339449543,
"acc_norm_stderr": 0.017923087667803053
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.18518518518518517,
"acc_stderr": 0.026491914727355168,
"acc_norm": 0.18518518518518517,
"acc_norm_stderr": 0.026491914727355168
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.029102254389674082,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.029102254389674082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.3080168776371308,
"acc_stderr": 0.030052389335605695,
"acc_norm": 0.3080168776371308,
"acc_norm_stderr": 0.030052389335605695
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.336322869955157,
"acc_stderr": 0.03170882426845501,
"acc_norm": 0.336322869955157,
"acc_norm_stderr": 0.03170882426845501
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.21374045801526717,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.21374045801526717,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.2809917355371901,
"acc_stderr": 0.04103203830514512,
"acc_norm": 0.2809917355371901,
"acc_norm_stderr": 0.04103203830514512
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.04236511258094633,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.04236511258094633
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2392638036809816,
"acc_stderr": 0.03351953879521269,
"acc_norm": 0.2392638036809816,
"acc_norm_stderr": 0.03351953879521269
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.04246624336697625,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.04246624336697625
},
"harness|hendrycksTest-management|5": {
"acc": 0.22330097087378642,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.22330097087378642,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.26495726495726496,
"acc_stderr": 0.02891120880274949,
"acc_norm": 0.26495726495726496,
"acc_norm_stderr": 0.02891120880274949
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.27458492975734355,
"acc_stderr": 0.015959829933084046,
"acc_norm": 0.27458492975734355,
"acc_norm_stderr": 0.015959829933084046
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24855491329479767,
"acc_stderr": 0.023267528432100174,
"acc_norm": 0.24855491329479767,
"acc_norm_stderr": 0.023267528432100174
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24692737430167597,
"acc_stderr": 0.014422292204808842,
"acc_norm": 0.24692737430167597,
"acc_norm_stderr": 0.014422292204808842
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.025058503316958167,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.025058503316958167
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2604501607717042,
"acc_stderr": 0.02492672322484553,
"acc_norm": 0.2604501607717042,
"acc_norm_stderr": 0.02492672322484553
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.02551873104953778,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.02551873104953778
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.24837027379400262,
"acc_stderr": 0.011035212598034517,
"acc_norm": 0.24837027379400262,
"acc_norm_stderr": 0.011035212598034517
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.34558823529411764,
"acc_stderr": 0.02888819310398864,
"acc_norm": 0.34558823529411764,
"acc_norm_stderr": 0.02888819310398864
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.017322789207784326,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.017322789207784326
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2909090909090909,
"acc_stderr": 0.04350271442923243,
"acc_norm": 0.2909090909090909,
"acc_norm_stderr": 0.04350271442923243
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.17959183673469387,
"acc_stderr": 0.024573293589585637,
"acc_norm": 0.17959183673469387,
"acc_norm_stderr": 0.024573293589585637
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.23383084577114427,
"acc_stderr": 0.029929415408348398,
"acc_norm": 0.23383084577114427,
"acc_norm_stderr": 0.029929415408348398
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3192771084337349,
"acc_stderr": 0.0362933532994786,
"acc_norm": 0.3192771084337349,
"acc_norm_stderr": 0.0362933532994786
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.03377310252209196,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.03377310252209196
},
"harness|truthfulqa:mc|0": {
"mc1": 0.21297429620563035,
"mc1_stderr": 0.014332203787059686,
"mc2": 0.3444347337952659,
"mc2_stderr": 0.013606216674916146
},
"harness|winogrande|5": {
"acc": 0.5509076558800315,
"acc_stderr": 0.01397945938914085
},
"harness|gsm8k|5": {
"acc": 0.003032600454890068,
"acc_stderr": 0.0015145735612245486
}
}
```
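The per-task entries above can be aggregated locally for a quick sanity check. A minimal sketch (the `mean_mmlu_acc` helper and the `sample` dict are illustrative assumptions, based only on the key layout of the JSON above):

```python
def mean_mmlu_acc(results: dict) -> float:
    # Average the "acc" metric across all MMLU (hendrycksTest) entries.
    accs = [
        metrics["acc"]
        for task, metrics in results.items()
        if task.startswith("harness|hendrycksTest-")
    ]
    return sum(accs) / len(accs)

# A small subset of the entries shown above:
sample = {
    "harness|hendrycksTest-virology|5": {"acc": 0.3192771084337349},
    "harness|hendrycksTest-world_religions|5": {"acc": 0.2631578947368421},
    "harness|winogrande|5": {"acc": 0.5509076558800315},  # ignored: not MMLU
}
print(round(mean_mmlu_acc(sample), 4))
# → 0.2912
```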
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
David-Xu/astronomy-stack-dpo-text-20-percent | David-Xu | "2024-03-10T07:16:12Z" | 0 | 1 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T07:16:06Z" | ---
dataset_info:
features:
- name: chosen
dtype: string
- name: rejected
dtype: string
- name: prompt
dtype: string
splits:
- name: train
num_bytes: 9764728
num_examples: 3588
- name: test
num_bytes: 1187244
num_examples: 398
download_size: 3288117
dataset_size: 10951972
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
meenham/MSC_korean | meenham | "2024-03-10T15:26:50Z" | 0 | 0 | [
"task_categories:translation",
"language:ko",
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"translation"
] | "2024-03-10T07:17:15Z" | ---
license: apache-2.0
task_categories:
- translation
language:
- ko
size_categories:
- 1K<n<10K
---
- Data
  - source
    - MSC data from the paper "Beyond Goldfish Memory: Long-Term Open-Domain Conversation"
    - train/valid/test datasets of session 4
  - translation (English -> Korean)
    - GPT-3.5-turbo was used for most of the translation
    - GPT-4: the first 66 examples of session_4_train (after these, translation switched to GPT-3.5) |
open-llm-leaderboard-old/details_fhai50032__RP-Coder-SM3 | open-llm-leaderboard-old | "2024-03-10T07:24:56Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T07:19:46Z" | ---
pretty_name: Evaluation run of fhai50032/RP-Coder-SM3
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [fhai50032/RP-Coder-SM3](https://huggingface.co/fhai50032/RP-Coder-SM3) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_fhai50032__RP-Coder-SM3\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T07:22:18.963453](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__RP-Coder-SM3/blob/main/results_2024-03-10T07-22-18.963453.json)(note\
\ that there might be results for other tasks in the repo if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6359839145935902,\n\
\ \"acc_stderr\": 0.03226872295819599,\n \"acc_norm\": 0.6373484968136244,\n\
\ \"acc_norm_stderr\": 0.03292829056146491,\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907926,\n \"mc2\": 0.5410727637055827,\n\
\ \"mc2_stderr\": 0.015496875016507235\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6160409556313993,\n \"acc_stderr\": 0.01421244498065189,\n\
\ \"acc_norm\": 0.6561433447098977,\n \"acc_norm_stderr\": 0.013880644570156213\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6499701254730134,\n\
\ \"acc_stderr\": 0.004760041843651487,\n \"acc_norm\": 0.842162915753834,\n\
\ \"acc_norm_stderr\": 0.003638430620613939\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6,\n \
\ \"acc_stderr\": 0.04232073695151589,\n \"acc_norm\": 0.6,\n \"\
acc_norm_stderr\": 0.04232073695151589\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7039473684210527,\n \"acc_stderr\": 0.03715062154998904,\n\
\ \"acc_norm\": 0.7039473684210527,\n \"acc_norm_stderr\": 0.03715062154998904\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6805555555555556,\n\
\ \"acc_stderr\": 0.03899073687357335,\n \"acc_norm\": 0.6805555555555556,\n\
\ \"acc_norm_stderr\": 0.03899073687357335\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \
\ \"acc_norm\": 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6069364161849711,\n\
\ \"acc_stderr\": 0.03724249595817731,\n \"acc_norm\": 0.6069364161849711,\n\
\ \"acc_norm_stderr\": 0.03724249595817731\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.37254901960784315,\n \"acc_stderr\": 0.048108401480826346,\n\
\ \"acc_norm\": 0.37254901960784315,\n \"acc_norm_stderr\": 0.048108401480826346\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4473684210526316,\n\
\ \"acc_stderr\": 0.04677473004491199,\n \"acc_norm\": 0.4473684210526316,\n\
\ \"acc_norm_stderr\": 0.04677473004491199\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5724137931034483,\n \"acc_stderr\": 0.041227371113703316,\n\
\ \"acc_norm\": 0.5724137931034483,\n \"acc_norm_stderr\": 0.041227371113703316\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.02540255550326091,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.02540255550326091\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7612903225806451,\n\
\ \"acc_stderr\": 0.02425107126220884,\n \"acc_norm\": 0.7612903225806451,\n\
\ \"acc_norm_stderr\": 0.02425107126220884\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5024630541871922,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7878787878787878,\n \"acc_stderr\": 0.03192271569548302,\n\
\ \"acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.03192271569548302\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.803030303030303,\n \"acc_stderr\": 0.028335609732463362,\n \"\
acc_norm\": 0.803030303030303,\n \"acc_norm_stderr\": 0.028335609732463362\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.337037037037037,\n \"acc_stderr\": 0.028820884666253255,\n \
\ \"acc_norm\": 0.337037037037037,\n \"acc_norm_stderr\": 0.028820884666253255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.029953823891887044,\n\
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.029953823891887044\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242742,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242742\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.015848255806501562,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.015848255806501562\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.03408655867977749,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.03408655867977749\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.803921568627451,\n \"acc_stderr\": 0.027865942286639325,\n \"\
acc_norm\": 0.803921568627451,\n \"acc_norm_stderr\": 0.027865942286639325\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7040358744394619,\n\
\ \"acc_stderr\": 0.030636591348699803,\n \"acc_norm\": 0.7040358744394619,\n\
\ \"acc_norm_stderr\": 0.030636591348699803\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7175572519083969,\n \"acc_stderr\": 0.03948406125768362,\n\
\ \"acc_norm\": 0.7175572519083969,\n \"acc_norm_stderr\": 0.03948406125768362\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8264462809917356,\n \"acc_stderr\": 0.03457272836917671,\n \"\
acc_norm\": 0.8264462809917356,\n \"acc_norm_stderr\": 0.03457272836917671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.49107142857142855,\n\
\ \"acc_stderr\": 0.04745033255489123,\n \"acc_norm\": 0.49107142857142855,\n\
\ \"acc_norm_stderr\": 0.04745033255489123\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406953,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406953\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8071519795657727,\n\
\ \"acc_stderr\": 0.014108533515757433,\n \"acc_norm\": 0.8071519795657727,\n\
\ \"acc_norm_stderr\": 0.014108533515757433\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6994219653179191,\n \"acc_stderr\": 0.024685316867257803,\n\
\ \"acc_norm\": 0.6994219653179191,\n \"acc_norm_stderr\": 0.024685316867257803\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.01618544417945717,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.01618544417945717\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.025553169991826528,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.025553169991826528\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7234726688102894,\n\
\ \"acc_stderr\": 0.02540383297817961,\n \"acc_norm\": 0.7234726688102894,\n\
\ \"acc_norm_stderr\": 0.02540383297817961\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6975308641975309,\n \"acc_stderr\": 0.02555765398186806,\n\
\ \"acc_norm\": 0.6975308641975309,\n \"acc_norm_stderr\": 0.02555765398186806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.42907801418439717,\n \"acc_stderr\": 0.029525914302558562,\n \
\ \"acc_norm\": 0.42907801418439717,\n \"acc_norm_stderr\": 0.029525914302558562\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4634941329856584,\n\
\ \"acc_stderr\": 0.012736153390214961,\n \"acc_norm\": 0.4634941329856584,\n\
\ \"acc_norm_stderr\": 0.012736153390214961\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6286764705882353,\n \"acc_stderr\": 0.029349803139765873,\n\
\ \"acc_norm\": 0.6286764705882353,\n \"acc_norm_stderr\": 0.029349803139765873\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.630718954248366,\n \"acc_stderr\": 0.01952431674486635,\n \
\ \"acc_norm\": 0.630718954248366,\n \"acc_norm_stderr\": 0.01952431674486635\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.710204081632653,\n \"acc_stderr\": 0.02904308868330432,\n\
\ \"acc_norm\": 0.710204081632653,\n \"acc_norm_stderr\": 0.02904308868330432\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8606965174129353,\n\
\ \"acc_stderr\": 0.024484487162913973,\n \"acc_norm\": 0.8606965174129353,\n\
\ \"acc_norm_stderr\": 0.024484487162913973\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.89,\n \"acc_stderr\": 0.03144660377352203,\n \
\ \"acc_norm\": 0.89,\n \"acc_norm_stderr\": 0.03144660377352203\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5120481927710844,\n\
\ \"acc_stderr\": 0.038913644958358154,\n \"acc_norm\": 0.5120481927710844,\n\
\ \"acc_norm_stderr\": 0.038913644958358154\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640044,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640044\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3806609547123623,\n\
\ \"mc1_stderr\": 0.016997627871907926,\n \"mc2\": 0.5410727637055827,\n\
\ \"mc2_stderr\": 0.015496875016507235\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8255722178374112,\n \"acc_stderr\": 0.010665187902498431\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.5981804397270659,\n \
\ \"acc_stderr\": 0.013504357787494042\n }\n}\n```"
repo_url: https://huggingface.co/fhai50032/RP-Coder-SM3
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-17-29.858673.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-22-18.963453.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-22-18.963453.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- '**/details_harness|winogrande|5_2024-03-10T07-17-29.858673.parquet'
- split: 2024_03_10T07_22_18.963453
path:
- '**/details_harness|winogrande|5_2024-03-10T07-22-18.963453.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T07-22-18.963453.parquet'
- config_name: results
data_files:
- split: 2024_03_10T07_17_29.858673
path:
- results_2024-03-10T07-17-29.858673.parquet
- split: 2024_03_10T07_22_18.963453
path:
- results_2024-03-10T07-22-18.963453.parquet
- split: latest
path:
- results_2024-03-10T07-22-18.963453.parquet
---
# Dataset Card for Evaluation run of fhai50032/RP-Coder-SM3
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [fhai50032/RP-Coder-SM3](https://huggingface.co/fhai50032/RP-Coder-SM3) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
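Judging from the split names listed in this card's config section, a run timestamp maps to its split name by replacing dashes and colons with underscores while keeping the microsecond dot. A small sketch of that (inferred) convention:

```python
def timestamp_to_split(ts: str) -> str:
    # "2024-03-10T07:22:18.963453" -> "2024_03_10T07_22_18.963453"
    # Dashes and colons become underscores; the dot before the
    # microseconds is kept, matching the split names above.
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-10T07:22:18.963453"))
```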
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_fhai50032__RP-Coder-SM3",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T07:22:18.963453](https://huggingface.co/datasets/open-llm-leaderboard/details_fhai50032__RP-Coder-SM3/blob/main/results_2024-03-10T07-22-18.963453.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "results" configuration and in the "latest" split of each eval):
```json
{
"all": {
"acc": 0.6359839145935902,
"acc_stderr": 0.03226872295819599,
"acc_norm": 0.6373484968136244,
"acc_norm_stderr": 0.03292829056146491,
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907926,
"mc2": 0.5410727637055827,
"mc2_stderr": 0.015496875016507235
},
"harness|arc:challenge|25": {
"acc": 0.6160409556313993,
"acc_stderr": 0.01421244498065189,
"acc_norm": 0.6561433447098977,
"acc_norm_stderr": 0.013880644570156213
},
"harness|hellaswag|10": {
"acc": 0.6499701254730134,
"acc_stderr": 0.004760041843651487,
"acc_norm": 0.842162915753834,
"acc_norm_stderr": 0.003638430620613939
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6,
"acc_stderr": 0.04232073695151589,
"acc_norm": 0.6,
"acc_norm_stderr": 0.04232073695151589
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7039473684210527,
"acc_stderr": 0.03715062154998904,
"acc_norm": 0.7039473684210527,
"acc_norm_stderr": 0.03715062154998904
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6805555555555556,
"acc_stderr": 0.03899073687357335,
"acc_norm": 0.6805555555555556,
"acc_norm_stderr": 0.03899073687357335
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6069364161849711,
"acc_stderr": 0.03724249595817731,
"acc_norm": 0.6069364161849711,
"acc_norm_stderr": 0.03724249595817731
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.37254901960784315,
"acc_stderr": 0.048108401480826346,
"acc_norm": 0.37254901960784315,
"acc_norm_stderr": 0.048108401480826346
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4473684210526316,
"acc_stderr": 0.04677473004491199,
"acc_norm": 0.4473684210526316,
"acc_norm_stderr": 0.04677473004491199
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5724137931034483,
"acc_stderr": 0.041227371113703316,
"acc_norm": 0.5724137931034483,
"acc_norm_stderr": 0.041227371113703316
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.02540255550326091,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.02540255550326091
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7612903225806451,
"acc_stderr": 0.02425107126220884,
"acc_norm": 0.7612903225806451,
"acc_norm_stderr": 0.02425107126220884
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.03192271569548302,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.03192271569548302
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.803030303030303,
"acc_stderr": 0.028335609732463362,
"acc_norm": 0.803030303030303,
"acc_norm_stderr": 0.028335609732463362
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.337037037037037,
"acc_stderr": 0.028820884666253255,
"acc_norm": 0.337037037037037,
"acc_norm_stderr": 0.028820884666253255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.029953823891887044,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.029953823891887044
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242742,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242742
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.015848255806501562,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.015848255806501562
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.03408655867977749,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.03408655867977749
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.803921568627451,
"acc_stderr": 0.027865942286639325,
"acc_norm": 0.803921568627451,
"acc_norm_stderr": 0.027865942286639325
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7040358744394619,
"acc_stderr": 0.030636591348699803,
"acc_norm": 0.7040358744394619,
"acc_norm_stderr": 0.030636591348699803
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7175572519083969,
"acc_stderr": 0.03948406125768362,
"acc_norm": 0.7175572519083969,
"acc_norm_stderr": 0.03948406125768362
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8264462809917356,
"acc_stderr": 0.03457272836917671,
"acc_norm": 0.8264462809917356,
"acc_norm_stderr": 0.03457272836917671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.49107142857142855,
"acc_stderr": 0.04745033255489123,
"acc_norm": 0.49107142857142855,
"acc_norm_stderr": 0.04745033255489123
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406953,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406953
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8071519795657727,
"acc_stderr": 0.014108533515757433,
"acc_norm": 0.8071519795657727,
"acc_norm_stderr": 0.014108533515757433
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6994219653179191,
"acc_stderr": 0.024685316867257803,
"acc_norm": 0.6994219653179191,
"acc_norm_stderr": 0.024685316867257803
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.01618544417945717,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.01618544417945717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.025553169991826528,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.025553169991826528
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7234726688102894,
"acc_stderr": 0.02540383297817961,
"acc_norm": 0.7234726688102894,
"acc_norm_stderr": 0.02540383297817961
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6975308641975309,
"acc_stderr": 0.02555765398186806,
"acc_norm": 0.6975308641975309,
"acc_norm_stderr": 0.02555765398186806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.42907801418439717,
"acc_stderr": 0.029525914302558562,
"acc_norm": 0.42907801418439717,
"acc_norm_stderr": 0.029525914302558562
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4634941329856584,
"acc_stderr": 0.012736153390214961,
"acc_norm": 0.4634941329856584,
"acc_norm_stderr": 0.012736153390214961
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6286764705882353,
"acc_stderr": 0.029349803139765873,
"acc_norm": 0.6286764705882353,
"acc_norm_stderr": 0.029349803139765873
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.630718954248366,
"acc_stderr": 0.01952431674486635,
"acc_norm": 0.630718954248366,
"acc_norm_stderr": 0.01952431674486635
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.710204081632653,
"acc_stderr": 0.02904308868330432,
"acc_norm": 0.710204081632653,
"acc_norm_stderr": 0.02904308868330432
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8606965174129353,
"acc_stderr": 0.024484487162913973,
"acc_norm": 0.8606965174129353,
"acc_norm_stderr": 0.024484487162913973
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.89,
"acc_stderr": 0.03144660377352203,
"acc_norm": 0.89,
"acc_norm_stderr": 0.03144660377352203
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5120481927710844,
"acc_stderr": 0.038913644958358154,
"acc_norm": 0.5120481927710844,
"acc_norm_stderr": 0.038913644958358154
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640044,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640044
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3806609547123623,
"mc1_stderr": 0.016997627871907926,
"mc2": 0.5410727637055827,
"mc2_stderr": 0.015496875016507235
},
"harness|winogrande|5": {
"acc": 0.8255722178374112,
"acc_stderr": 0.010665187902498431
},
"harness|gsm8k|5": {
"acc": 0.5981804397270659,
"acc_stderr": 0.013504357787494042
}
}
```
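Once a results dict shaped like the JSON above is in hand (for example, loaded from the `results` configuration), per-task metrics can be filtered by their harness prefix. A minimal sketch, using a small illustrative subset of the scores above:

```python
# Illustrative subset of a results dict shaped like the JSON above.
results = {
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.7039473684210527},
    "harness|winogrande|5": {"acc": 0.8255722178374112},
}

# Collect accuracies for MMLU ("hendrycksTest") tasks only.
mmlu_accs = [
    metrics["acc"]
    for task, metrics in results.items()
    if task.startswith("harness|hendrycksTest-")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU tasks: {len(mmlu_accs)}, mean acc: {mmlu_mean:.4f}")
```

The same prefix filtering works for any harness task family (e.g. `harness|truthfulqa:mc|0` or `harness|winogrande|5`).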
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
dce9112/dataset_LLaMa2_test | dce9112 | "2024-03-10T07:28:53Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T07:28:08Z" | ---
license: mit
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 3890
num_examples: 32
download_size: 3144
dataset_size: 3890
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_delayedkarma__NeuralHermes-2.5-Mistral-7B | open-llm-leaderboard-old | "2024-03-10T07:58:03Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T07:57:43Z" | ---
pretty_name: Evaluation run of delayedkarma/NeuralHermes-2.5-Mistral-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [delayedkarma/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/delayedkarma/NeuralHermes-2.5-Mistral-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_delayedkarma__NeuralHermes-2.5-Mistral-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T07:55:22.452252](https://huggingface.co/datasets/open-llm-leaderboard/details_delayedkarma__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-03-10T07-55-22.452252.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.631917558240621,\n\
\ \"acc_stderr\": 0.032259628416007755,\n \"acc_norm\": 0.6382580775427679,\n\
\ \"acc_norm_stderr\": 0.03290869969358224,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5383196217878969,\n\
\ \"mc2_stderr\": 0.015342261550018428\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6245733788395904,\n \"acc_stderr\": 0.014150631435111726,\n\
\ \"acc_norm\": 0.6655290102389079,\n \"acc_norm_stderr\": 0.013787460322441374\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6605257916749652,\n\
\ \"acc_stderr\": 0.004725630911520331,\n \"acc_norm\": 0.8500298745269866,\n\
\ \"acc_norm_stderr\": 0.0035631244274585212\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252606,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252606\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353227,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353227\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.0286372356398009,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.0286372356398009\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.05016135580465919,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.05016135580465919\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6184971098265896,\n\
\ \"acc_stderr\": 0.03703851193099521,\n \"acc_norm\": 0.6184971098265896,\n\
\ \"acc_norm_stderr\": 0.03703851193099521\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3627450980392157,\n \"acc_stderr\": 0.04784060704105653,\n\
\ \"acc_norm\": 0.3627450980392157,\n \"acc_norm_stderr\": 0.04784060704105653\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.548936170212766,\n \"acc_stderr\": 0.03252909619613197,\n\
\ \"acc_norm\": 0.548936170212766,\n \"acc_norm_stderr\": 0.03252909619613197\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.49122807017543857,\n\
\ \"acc_stderr\": 0.04702880432049615,\n \"acc_norm\": 0.49122807017543857,\n\
\ \"acc_norm_stderr\": 0.04702880432049615\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5241379310344828,\n \"acc_stderr\": 0.0416180850350153,\n\
\ \"acc_norm\": 0.5241379310344828,\n \"acc_norm_stderr\": 0.0416180850350153\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055266,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055266\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4523809523809524,\n\
\ \"acc_stderr\": 0.044518079590553275,\n \"acc_norm\": 0.4523809523809524,\n\
\ \"acc_norm_stderr\": 0.044518079590553275\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7838709677419354,\n\
\ \"acc_stderr\": 0.023415293433568532,\n \"acc_norm\": 0.7838709677419354,\n\
\ \"acc_norm_stderr\": 0.023415293433568532\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5320197044334976,\n \"acc_stderr\": 0.03510766597959215,\n\
\ \"acc_norm\": 0.5320197044334976,\n \"acc_norm_stderr\": 0.03510766597959215\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.793939393939394,\n \"acc_stderr\": 0.031584153240477114,\n\
\ \"acc_norm\": 0.793939393939394,\n \"acc_norm_stderr\": 0.031584153240477114\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7929292929292929,\n \"acc_stderr\": 0.02886977846026704,\n \"\
acc_norm\": 0.7929292929292929,\n \"acc_norm_stderr\": 0.02886977846026704\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8911917098445595,\n \"acc_stderr\": 0.022473253332768776,\n\
\ \"acc_norm\": 0.8911917098445595,\n \"acc_norm_stderr\": 0.022473253332768776\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.024756000382130956,\n\
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.024756000382130956\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3037037037037037,\n \"acc_stderr\": 0.02803792996911499,\n \
\ \"acc_norm\": 0.3037037037037037,\n \"acc_norm_stderr\": 0.02803792996911499\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.03068473711513537,\n \
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.03068473711513537\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8366972477064221,\n \"acc_stderr\": 0.01584825580650155,\n \"\
acc_norm\": 0.8366972477064221,\n \"acc_norm_stderr\": 0.01584825580650155\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49537037037037035,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.49537037037037035,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.028626547912437413,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.028626547912437413\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.810126582278481,\n \"acc_stderr\": 0.02553010046023349,\n \
\ \"acc_norm\": 0.810126582278481,\n \"acc_norm_stderr\": 0.02553010046023349\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6995515695067265,\n\
\ \"acc_stderr\": 0.030769352008229143,\n \"acc_norm\": 0.6995515695067265,\n\
\ \"acc_norm_stderr\": 0.030769352008229143\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7914110429447853,\n \"acc_stderr\": 0.031921934489347235,\n\
\ \"acc_norm\": 0.7914110429447853,\n \"acc_norm_stderr\": 0.031921934489347235\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5,\n\
\ \"acc_stderr\": 0.04745789978762494,\n \"acc_norm\": 0.5,\n \
\ \"acc_norm_stderr\": 0.04745789978762494\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8199233716475096,\n\
\ \"acc_stderr\": 0.013740797258579823,\n \"acc_norm\": 0.8199233716475096,\n\
\ \"acc_norm_stderr\": 0.013740797258579823\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3128491620111732,\n\
\ \"acc_stderr\": 0.015506892594647267,\n \"acc_norm\": 0.3128491620111732,\n\
\ \"acc_norm_stderr\": 0.015506892594647267\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7418300653594772,\n \"acc_stderr\": 0.02505850331695814,\n\
\ \"acc_norm\": 0.7418300653594772,\n \"acc_norm_stderr\": 0.02505850331695814\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6945337620578779,\n\
\ \"acc_stderr\": 0.026160584450140446,\n \"acc_norm\": 0.6945337620578779,\n\
\ \"acc_norm_stderr\": 0.026160584450140446\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7530864197530864,\n \"acc_stderr\": 0.02399350170904211,\n\
\ \"acc_norm\": 0.7530864197530864,\n \"acc_norm_stderr\": 0.02399350170904211\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.524822695035461,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.524822695035461,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045699,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045699\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396546,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396546\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7387755102040816,\n \"acc_stderr\": 0.028123429335142773,\n\
\ \"acc_norm\": 0.7387755102040816,\n \"acc_norm_stderr\": 0.028123429335142773\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8109452736318408,\n\
\ \"acc_stderr\": 0.027686913588013024,\n \"acc_norm\": 0.8109452736318408,\n\
\ \"acc_norm_stderr\": 0.027686913588013024\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046046,\n \"mc2\": 0.5383196217878969,\n\
\ \"mc2_stderr\": 0.015342261550018428\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7797947908445146,\n \"acc_stderr\": 0.011646276755089688\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.34040940106141016,\n \
\ \"acc_stderr\": 0.013052097103299099\n }\n}\n```"
repo_url: https://huggingface.co/delayedkarma/NeuralHermes-2.5-Mistral-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-55-22.452252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T07-55-22.452252.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- '**/details_harness|winogrande|5_2024-03-10T07-55-22.452252.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T07-55-22.452252.parquet'
- config_name: results
data_files:
- split: 2024_03_10T07_55_22.452252
path:
- results_2024-03-10T07-55-22.452252.parquet
- split: latest
path:
- results_2024-03-10T07-55-22.452252.parquet
---
# Dataset Card for Evaluation run of delayedkarma/NeuralHermes-2.5-Mistral-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [delayedkarma/NeuralHermes-2.5-Mistral-7B](https://huggingface.co/delayedkarma/NeuralHermes-2.5-Mistral-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_delayedkarma__NeuralHermes-2.5-Mistral-7B",
"harness_winogrande_5",
split="train")
```
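The per-task config names follow a regular pattern, visible in this card's YAML header: `harness_`, then the task name with non-alphanumeric characters replaced by underscores, then the number of few-shot examples. A minimal sketch of that naming scheme (the `config_name` helper is illustrative, not part of any library):

```python
# Hedged sketch of the config-naming scheme observed in this card's YAML header:
# "harness_" + task name (non-alphanumerics -> "_") + "_" + n_shots.
def config_name(task: str, n_shots: int) -> str:
    """Build the dataset config name for a given eval task and shot count."""
    safe = "".join(c if c.isalnum() else "_" for c in task)
    return f"harness_{safe}_{n_shots}"

print(config_name("hendrycksTest-anatomy", 5))  # harness_hendrycksTest_anatomy_5
print(config_name("truthfulqa:mc", 0))          # harness_truthfulqa_mc_0
```

This makes it easy to iterate over the MMLU subtasks programmatically instead of copying each config name from the YAML by hand.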
## Latest results
These are the [latest results from run 2024-03-10T07:55:22.452252](https://huggingface.co/datasets/open-llm-leaderboard/details_delayedkarma__NeuralHermes-2.5-Mistral-7B/blob/main/results_2024-03-10T07-55-22.452252.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.631917558240621,
"acc_stderr": 0.032259628416007755,
"acc_norm": 0.6382580775427679,
"acc_norm_stderr": 0.03290869969358224,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5383196217878969,
"mc2_stderr": 0.015342261550018428
},
"harness|arc:challenge|25": {
"acc": 0.6245733788395904,
"acc_stderr": 0.014150631435111726,
"acc_norm": 0.6655290102389079,
"acc_norm_stderr": 0.013787460322441374
},
"harness|hellaswag|10": {
"acc": 0.6605257916749652,
"acc_stderr": 0.004725630911520331,
"acc_norm": 0.8500298745269866,
"acc_norm_stderr": 0.0035631244274585212
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252606,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252606
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353227,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353227
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.0286372356398009,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.0286372356398009
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.05016135580465919,
"acc_norm": 0.47,
"acc_norm_stderr": 0.05016135580465919
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6184971098265896,
"acc_stderr": 0.03703851193099521,
"acc_norm": 0.6184971098265896,
"acc_norm_stderr": 0.03703851193099521
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3627450980392157,
"acc_stderr": 0.04784060704105653,
"acc_norm": 0.3627450980392157,
"acc_norm_stderr": 0.04784060704105653
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.548936170212766,
"acc_stderr": 0.03252909619613197,
"acc_norm": 0.548936170212766,
"acc_norm_stderr": 0.03252909619613197
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5241379310344828,
"acc_stderr": 0.0416180850350153,
"acc_norm": 0.5241379310344828,
"acc_norm_stderr": 0.0416180850350153
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055266,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055266
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4523809523809524,
"acc_stderr": 0.044518079590553275,
"acc_norm": 0.4523809523809524,
"acc_norm_stderr": 0.044518079590553275
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7838709677419354,
"acc_stderr": 0.023415293433568532,
"acc_norm": 0.7838709677419354,
"acc_norm_stderr": 0.023415293433568532
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5320197044334976,
"acc_stderr": 0.03510766597959215,
"acc_norm": 0.5320197044334976,
"acc_norm_stderr": 0.03510766597959215
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.793939393939394,
"acc_stderr": 0.031584153240477114,
"acc_norm": 0.793939393939394,
"acc_norm_stderr": 0.031584153240477114
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7929292929292929,
"acc_stderr": 0.02886977846026704,
"acc_norm": 0.7929292929292929,
"acc_norm_stderr": 0.02886977846026704
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8911917098445595,
"acc_stderr": 0.022473253332768776,
"acc_norm": 0.8911917098445595,
"acc_norm_stderr": 0.022473253332768776
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.024756000382130956,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.024756000382130956
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3037037037037037,
"acc_stderr": 0.02803792996911499,
"acc_norm": 0.3037037037037037,
"acc_norm_stderr": 0.02803792996911499
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.03068473711513537,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.03068473711513537
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8366972477064221,
"acc_stderr": 0.01584825580650155,
"acc_norm": 0.8366972477064221,
"acc_norm_stderr": 0.01584825580650155
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49537037037037035,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.49537037037037035,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.028626547912437413,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.028626547912437413
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.810126582278481,
"acc_stderr": 0.02553010046023349,
"acc_norm": 0.810126582278481,
"acc_norm_stderr": 0.02553010046023349
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6995515695067265,
"acc_stderr": 0.030769352008229143,
"acc_norm": 0.6995515695067265,
"acc_norm_stderr": 0.030769352008229143
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7914110429447853,
"acc_stderr": 0.031921934489347235,
"acc_norm": 0.7914110429447853,
"acc_norm_stderr": 0.031921934489347235
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5,
"acc_stderr": 0.04745789978762494,
"acc_norm": 0.5,
"acc_norm_stderr": 0.04745789978762494
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8199233716475096,
"acc_stderr": 0.013740797258579823,
"acc_norm": 0.8199233716475096,
"acc_norm_stderr": 0.013740797258579823
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3128491620111732,
"acc_stderr": 0.015506892594647267,
"acc_norm": 0.3128491620111732,
"acc_norm_stderr": 0.015506892594647267
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7418300653594772,
"acc_stderr": 0.02505850331695814,
"acc_norm": 0.7418300653594772,
"acc_norm_stderr": 0.02505850331695814
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6945337620578779,
"acc_stderr": 0.026160584450140446,
"acc_norm": 0.6945337620578779,
"acc_norm_stderr": 0.026160584450140446
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7530864197530864,
"acc_stderr": 0.02399350170904211,
"acc_norm": 0.7530864197530864,
"acc_norm_stderr": 0.02399350170904211
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.524822695035461,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.524822695035461,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045699,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045699
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396546,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396546
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7387755102040816,
"acc_stderr": 0.028123429335142773,
"acc_norm": 0.7387755102040816,
"acc_norm_stderr": 0.028123429335142773
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8109452736318408,
"acc_stderr": 0.027686913588013024,
"acc_norm": 0.8109452736318408,
"acc_norm_stderr": 0.027686913588013024
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046046,
"mc2": 0.5383196217878969,
"mc2_stderr": 0.015342261550018428
},
"harness|winogrande|5": {
"acc": 0.7797947908445146,
"acc_stderr": 0.011646276755089688
},
"harness|gsm8k|5": {
"acc": 0.34040940106141016,
"acc_stderr": 0.013052097103299099
}
}
```
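Once loaded (e.g. via `json.load` on the results file above), the headline leaderboard metrics can be pulled out of this nested dict by key. A small sketch, using an excerpt of the JSON shown above; the `headline` helper and its label strings are illustrative:

```python
# Excerpt of the results payload shown above (keys are "harness|<task>|<shots>").
results = {
    "all": {"acc": 0.631917558240621, "acc_norm": 0.6382580775427679},
    "harness|arc:challenge|25": {"acc_norm": 0.6655290102389079},
    "harness|hellaswag|10": {"acc_norm": 0.8500298745269866},
    "harness|winogrande|5": {"acc": 0.7797947908445146},
    "harness|gsm8k|5": {"acc": 0.34040940106141016},
}

def headline(results: dict) -> dict:
    """Pick out the per-benchmark scores the leaderboard reports."""
    return {
        "ARC (acc_norm)": results["harness|arc:challenge|25"]["acc_norm"],
        "HellaSwag (acc_norm)": results["harness|hellaswag|10"]["acc_norm"],
        "Winogrande (acc)": results["harness|winogrande|5"]["acc"],
        "GSM8K (acc)": results["harness|gsm8k|5"]["acc"],
    }

scores = headline(results)
```

The `"all"` entry holds the mean accuracy across the MMLU-style tasks, while metrics like `mc1`/`mc2` (TruthfulQA) live under their own task keys.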
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
David-Xu/astronomy-stack-dpo-20-percent | David-Xu | "2024-03-10T08:01:01Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T08:00:53Z" | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: chosen
list:
- name: content
dtype: string
- name: role
dtype: string
- name: rejected
list:
- name: content
dtype: string
- name: role
dtype: string
- name: score_question
dtype: string
- name: score_chosen
dtype: string
- name: score_rejected
dtype: string
splits:
- name: train
num_bytes: 11275712.334687736
num_examples: 3588
- name: test
num_bytes: 1250761.8476047153
num_examples: 398
download_size: 3377877
dataset_size: 12526474.18229245
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
mrid124/mp4 | mrid124 | "2024-03-14T04:59:36Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-03-10T08:09:00Z" | ---
license: apache-2.0
---
|
convaiinnovations/Nadi_Indic466k_Instruct | convaiinnovations | "2024-03-10T09:31:24Z" | 0 | 2 | [
"task_categories:text-generation",
"language:hi",
"language:pa",
"language:bn",
"language:ta",
"language:te",
"language:mr",
"language:gu",
"language:ur",
"language:kn",
"language:ml",
"language:or",
"language:as",
"language:sa",
"language:sd",
"language:ne",
"language:si",
"license:apache-2.0",
"size_categories:100K<n<1M",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"code"
] | [
"text-generation"
] | "2024-03-10T08:12:16Z" | ---
license: apache-2.0
task_categories:
- text-generation
language:
- hi
- pa
- bn
- ta
- te
- mr
- gu
- ur
- kn
- ml
- or
- as
- sa
- sd
- ne
- si
tags:
- code
size_categories:
- 100K<n<1M
---
# Nadi_Indic466K_Instruct Dataset
The Nadi_Indic466K_Instruct dataset is the world's first coding dataset with support for 18 Indian languages, 466k rows, and 142 million total tokens. Developers can use it to build Indian-language coding models (LLMs) for various programming languages.
QLoRA-based SFT/PPO/DPO fine-tuning can be performed on LLAMA-2, Mistral, or any open-source text-generation LLM using this dataset.
The dataset was carefully curated so that the code remains in English while the surrounding text is in the desired language.
## Dataset Details
- Total tokens in Hindi: 1,609,056 tokens
- Total tokens in Punjabi: 13,472,644 tokens
- Total tokens in Bengali: 11,514,502 tokens
- Total tokens in Tamil: 10,025,914 tokens
- Total tokens in Telugu: 1,943,389 tokens
- Total tokens in Marathi: 10,826,335 tokens
- Total tokens in Gujarati: 2,126,480 tokens
- Total tokens in Urdu: 2,675,491 tokens
- Total tokens in Kannada: 9,977,750 tokens
- Total tokens in Malayalam: 9,667,277 tokens
- Total tokens in Odia: 11,452,624 tokens
- Total tokens in Assamese: 1,944,119 tokens
- Total tokens in Sanskrit: 11,445,658 tokens
- Total tokens in Maithili: 7,203,251 tokens
- Total tokens in Bhojpuri: 11,099,822 tokens
- Total tokens in Sindhi: 13,536,792 tokens
- Total tokens in Nepali: 11,155,856 tokens
- Total tokens in Sinhala: 353,556 tokens
## Supported Languages
The Nadi_Indic466K_Instruct dataset supports the following Indian languages along with their language codes:
- `hi`: Hindi
- `pa`: Punjabi
- `bn`: Bengali
- `ta`: Tamil
- `te`: Telugu
- `mr`: Marathi
- `gu`: Gujarati
- `ur`: Urdu
- `kn`: Kannada
- `ml`: Malayalam
- `or`: Odia
- `as`: Assamese
- `sa`: Sanskrit
- `mai`: Maithili
- `bho`: Bhojpuri
- `sd`: Sindhi
- `ne`: Nepali
- `si`: Sinhala
## Potential Applications
The Nadi_Indic466K_Instruct dataset can be used for various applications, including:
1. Building Indian language-based large language models (LLMs) for coding.
2. Fine-tuning LLAMA-2, Mistral, or any other open-source LLM.
3. Supporting programming languages such as Python, C, C++, Java, PHP, C#, TypeScript, Kotlin, SQL, Dart, Ruby, Bash, and more.
By leveraging this dataset, developers can create more reliable and accurate coding language models that incorporate Indian languages.
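As a minimal sketch of the per-language filtering step that typically precedes fine-tuning, the snippet below selects rows by language code from plain Python dicts. The field names (`lang`, `instruction`, `output`) are assumptions for illustration; inspect the actual schema after loading the dataset with `datasets.load_dataset("convaiinnovations/Nadi_Indic466k_Instruct")`.

```python
# Sketch: keep only rows tagged with one language code before fine-tuning.
# Field names here are hypothetical -- check the real dataset schema.
rows = [
    {"lang": "hi", "instruction": "...", "output": "def add(a, b): return a + b"},
    {"lang": "ta", "instruction": "...", "output": "print('vanakkam')"},
    {"lang": "hi", "instruction": "...", "output": "for i in range(3): print(i)"},
]

def filter_by_lang(rows, lang_code):
    """Return only the rows whose language tag matches lang_code."""
    return [r for r in rows if r["lang"] == lang_code]

hindi_rows = filter_by_lang(rows, "hi")
print(len(hindi_rows))  # 2
```

With the real dataset, the same pattern can be applied via `dataset.filter(lambda r: r["lang"] == "hi")` to build a single-language fine-tuning split.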
This dataset is provided by ConvAI Innovations Pvt. Ltd. (2024).
## Citation
If you build a model on top of this dataset, or modify or combine it with your own data, please cite the dataset using the following format:
```bibtex
@misc{nadi_indic466k_instruct_dataset_2024,
  author = {ConvAI Innovations Pvt. Ltd.},
  title  = {Nadi_Indic466K_Instruct Dataset},
  year   = {2024},
  url    = {https://huggingface.co/datasets/nandakishor597/Nadi_Indic466k_Instruct}
}
```
 |
jjjaehee/customhkcode2 | jjjaehee | "2024-03-10T08:23:10Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T08:22:41Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
iNeil77/OBF_tokenizer_dataset | iNeil77 | "2024-03-10T08:39:50Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T08:34:25Z" | ---
dataset_info:
features:
- name: content
dtype: string
splits:
- name: train
num_bytes: 6139184788.960412
num_examples: 1500000
download_size: 3779010022
dataset_size: 6139184788.960412
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
AkashMnd/prismadgen | AkashMnd | "2024-03-10T08:35:49Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T08:35:30Z" | ---
license: mit
---
|
natnitaract/h2o-kaggel-llm-science-exam-2023-rag | natnitaract | "2024-05-10T17:03:44Z" | 0 | 1 | [
"language:en",
"license:cc-by-3.0",
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T08:39:41Z" | ---
license: cc-by-3.0
language:
- en
co-author: Teetouch Jaknamon
--- |
open-llm-leaderboard-old/details_TeeZee__GALAXY-XB-v.01 | open-llm-leaderboard-old | "2024-03-10T08:41:16Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T08:40:54Z" | ---
pretty_name: Evaluation run of TeeZee/GALAXY-XB-v.01
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [TeeZee/GALAXY-XB-v.01](https://huggingface.co/TeeZee/GALAXY-XB-v.01) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T08:38:37.798892](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01/blob/main/results_2024-03-10T08-38-37.798892.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6489294861573784,\n\
\ \"acc_stderr\": 0.031883763657466264,\n \"acc_norm\": 0.6533932757026093,\n\
\ \"acc_norm_stderr\": 0.03252706366243769,\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.4367256901069689,\n\
\ \"mc2_stderr\": 0.014358645276062254\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6092150170648464,\n \"acc_norm_stderr\": 0.01425856388051378\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6401115315674168,\n\
\ \"acc_stderr\": 0.004789865379084518,\n \"acc_norm\": 0.8292172873929496,\n\
\ \"acc_norm_stderr\": 0.003755498941781852\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.04943110704237103,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.04943110704237103\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.04218506215368879,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.04218506215368879\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.68,\n\
\ \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n \
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.690566037735849,\n \"acc_stderr\": 0.02845015479411864,\n\
\ \"acc_norm\": 0.690566037735849,\n \"acc_norm_stderr\": 0.02845015479411864\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7569444444444444,\n\
\ \"acc_stderr\": 0.03586879280080339,\n \"acc_norm\": 0.7569444444444444,\n\
\ \"acc_norm_stderr\": 0.03586879280080339\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.04988876515698589,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.04988876515698589\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \"\
acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247078,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247078\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\": 0.77,\n\
\ \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6042553191489362,\n \"acc_stderr\": 0.031967586978353627,\n\
\ \"acc_norm\": 0.6042553191489362,\n \"acc_norm_stderr\": 0.031967586978353627\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4824561403508772,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.4824561403508772,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.040824829046386284,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.040824829046386284\n \
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4312169312169312,\n \"acc_stderr\": 0.025506481698138198,\n \"\
acc_norm\": 0.4312169312169312,\n \"acc_norm_stderr\": 0.025506481698138198\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.8225806451612904,\n \"acc_stderr\": 0.02173254068932928,\n \"\
acc_norm\": 0.8225806451612904,\n \"acc_norm_stderr\": 0.02173254068932928\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.45320197044334976,\n \"acc_stderr\": 0.03502544650845872,\n \"\
acc_norm\": 0.45320197044334976,\n \"acc_norm_stderr\": 0.03502544650845872\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.63,\n \"acc_stderr\": 0.04852365870939098,\n \"acc_norm\"\
: 0.63,\n \"acc_norm_stderr\": 0.04852365870939098\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7636363636363637,\n \"acc_stderr\": 0.03317505930009181,\n\
\ \"acc_norm\": 0.7636363636363637,\n \"acc_norm_stderr\": 0.03317505930009181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8333333333333334,\n \"acc_stderr\": 0.026552207828215282,\n \"\
acc_norm\": 0.8333333333333334,\n \"acc_norm_stderr\": 0.026552207828215282\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.676923076923077,\n \"acc_stderr\": 0.023710888501970565,\n \
\ \"acc_norm\": 0.676923076923077,\n \"acc_norm_stderr\": 0.023710888501970565\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.37748344370860926,\n \"acc_stderr\": 0.0395802723112157,\n \"\
acc_norm\": 0.37748344370860926,\n \"acc_norm_stderr\": 0.0395802723112157\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374307,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374307\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.625,\n \"acc_stderr\": 0.033016908987210894,\n \"acc_norm\": 0.625,\n\
\ \"acc_norm_stderr\": 0.033016908987210894\n },\n \"harness|hendrycksTest-high_school_us_history|5\"\
: {\n \"acc\": 0.8627450980392157,\n \"acc_stderr\": 0.02415222596280158,\n\
\ \"acc_norm\": 0.8627450980392157,\n \"acc_norm_stderr\": 0.02415222596280158\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8143459915611815,\n \"acc_stderr\": 0.02531049537694485,\n \
\ \"acc_norm\": 0.8143459915611815,\n \"acc_norm_stderr\": 0.02531049537694485\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.03768335959728744,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.03768335959728744\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8016528925619835,\n \"acc_stderr\": 0.03640118271990947,\n \"\
acc_norm\": 0.8016528925619835,\n \"acc_norm_stderr\": 0.03640118271990947\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.41964285714285715,\n\
\ \"acc_stderr\": 0.04684099321077106,\n \"acc_norm\": 0.41964285714285715,\n\
\ \"acc_norm_stderr\": 0.04684099321077106\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8252427184466019,\n \"acc_stderr\": 0.03760178006026621,\n\
\ \"acc_norm\": 0.8252427184466019,\n \"acc_norm_stderr\": 0.03760178006026621\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.02250903393707781,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.02250903393707781\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8135376756066411,\n\
\ \"acc_stderr\": 0.013927751372001506,\n \"acc_norm\": 0.8135376756066411,\n\
\ \"acc_norm_stderr\": 0.013927751372001506\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.024332146779134117,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.024332146779134117\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2636871508379888,\n\
\ \"acc_stderr\": 0.014736926383761983,\n \"acc_norm\": 0.2636871508379888,\n\
\ \"acc_norm_stderr\": 0.014736926383761983\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7581699346405228,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.7581699346405228,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6913183279742765,\n\
\ \"acc_stderr\": 0.026236965881153262,\n \"acc_norm\": 0.6913183279742765,\n\
\ \"acc_norm_stderr\": 0.026236965881153262\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135128,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135128\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.475177304964539,\n \"acc_stderr\": 0.029790719243829714,\n \
\ \"acc_norm\": 0.475177304964539,\n \"acc_norm_stderr\": 0.029790719243829714\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4921773142112125,\n\
\ \"acc_stderr\": 0.012768673076111903,\n \"acc_norm\": 0.4921773142112125,\n\
\ \"acc_norm_stderr\": 0.012768673076111903\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.7095588235294118,\n \"acc_stderr\": 0.02757646862274053,\n\
\ \"acc_norm\": 0.7095588235294118,\n \"acc_norm_stderr\": 0.02757646862274053\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.018850084696468705,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.018850084696468705\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7181818181818181,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.7181818181818181,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578323,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578323\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.536144578313253,\n\
\ \"acc_stderr\": 0.038823108508905954,\n \"acc_norm\": 0.536144578313253,\n\
\ \"acc_norm_stderr\": 0.038823108508905954\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8362573099415205,\n \"acc_stderr\": 0.028380919596145866,\n\
\ \"acc_norm\": 0.8362573099415205,\n \"acc_norm_stderr\": 0.028380919596145866\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2766217870257038,\n\
\ \"mc1_stderr\": 0.015659605755326923,\n \"mc2\": 0.4367256901069689,\n\
\ \"mc2_stderr\": 0.014358645276062254\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8113654301499605,\n \"acc_stderr\": 0.010995172318019806\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.43442001516300227,\n \
\ \"acc_stderr\": 0.013653507211411403\n }\n}\n```"
repo_url: https://huggingface.co/TeeZee/GALAXY-XB-v.01
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|arc:challenge|25_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|gsm8k|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hellaswag|10_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T08-38-37.798892.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T08-38-37.798892.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- '**/details_harness|winogrande|5_2024-03-10T08-38-37.798892.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T08-38-37.798892.parquet'
- config_name: results
data_files:
- split: 2024_03_10T08_38_37.798892
path:
- results_2024-03-10T08-38-37.798892.parquet
- split: latest
path:
- results_2024-03-10T08-38-37.798892.parquet
---
# Dataset Card for Evaluation run of TeeZee/GALAXY-XB-v.01
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [TeeZee/GALAXY-XB-v.01](https://huggingface.co/TeeZee/GALAXY-XB-v.01) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01",
"harness_winogrande_5",
split="train")
```
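The split names used above appear to be derived from the run timestamp by replacing hyphens with underscores (the dots are kept). A minimal sketch, assuming that convention holds for this repository:

```python
# Derive the split name for a run from its timestamp, assuming the
# convention "hyphens become underscores" seen in this card's YAML.
run_timestamp = "2024-03-10T08-38-37.798892"
split_name = run_timestamp.replace("-", "_")
print(split_name)  # 2024_03_10T08_38_37.798892
```

Passing this derived name as `split=` instead of `"latest"` should select that specific run's results.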
## Latest results
These are the [latest results from run 2024-03-10T08:38:37.798892](https://huggingface.co/datasets/open-llm-leaderboard/details_TeeZee__GALAXY-XB-v.01/blob/main/results_2024-03-10T08-38-37.798892.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the "latest" split of the corresponding configuration):
```json
{
"all": {
"acc": 0.6489294861573784,
"acc_stderr": 0.031883763657466264,
"acc_norm": 0.6533932757026093,
"acc_norm_stderr": 0.03252706366243769,
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.4367256901069689,
"mc2_stderr": 0.014358645276062254
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6092150170648464,
"acc_norm_stderr": 0.01425856388051378
},
"harness|hellaswag|10": {
"acc": 0.6401115315674168,
"acc_stderr": 0.004789865379084518,
"acc_norm": 0.8292172873929496,
"acc_norm_stderr": 0.003755498941781852
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.41,
"acc_stderr": 0.04943110704237103,
"acc_norm": 0.41,
"acc_norm_stderr": 0.04943110704237103
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.04218506215368879,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.04218506215368879
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.690566037735849,
"acc_stderr": 0.02845015479411864,
"acc_norm": 0.690566037735849,
"acc_norm_stderr": 0.02845015479411864
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7569444444444444,
"acc_stderr": 0.03586879280080339,
"acc_norm": 0.7569444444444444,
"acc_norm_stderr": 0.03586879280080339
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.44,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.44,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247078,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247078
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6042553191489362,
"acc_stderr": 0.031967586978353627,
"acc_norm": 0.6042553191489362,
"acc_norm_stderr": 0.031967586978353627
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4824561403508772,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.4824561403508772,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6,
"acc_stderr": 0.040824829046386284,
"acc_norm": 0.6,
"acc_norm_stderr": 0.040824829046386284
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4312169312169312,
"acc_stderr": 0.025506481698138198,
"acc_norm": 0.4312169312169312,
"acc_norm_stderr": 0.025506481698138198
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8225806451612904,
"acc_stderr": 0.02173254068932928,
"acc_norm": 0.8225806451612904,
"acc_norm_stderr": 0.02173254068932928
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.45320197044334976,
"acc_stderr": 0.03502544650845872,
"acc_norm": 0.45320197044334976,
"acc_norm_stderr": 0.03502544650845872
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.63,
"acc_stderr": 0.04852365870939098,
"acc_norm": 0.63,
"acc_norm_stderr": 0.04852365870939098
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7636363636363637,
"acc_stderr": 0.03317505930009181,
"acc_norm": 0.7636363636363637,
"acc_norm_stderr": 0.03317505930009181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8333333333333334,
"acc_stderr": 0.026552207828215282,
"acc_norm": 0.8333333333333334,
"acc_norm_stderr": 0.026552207828215282
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.676923076923077,
"acc_stderr": 0.023710888501970565,
"acc_norm": 0.676923076923077,
"acc_norm_stderr": 0.023710888501970565
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.37748344370860926,
"acc_stderr": 0.0395802723112157,
"acc_norm": 0.37748344370860926,
"acc_norm_stderr": 0.0395802723112157
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374307,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374307
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.625,
"acc_stderr": 0.033016908987210894,
"acc_norm": 0.625,
"acc_norm_stderr": 0.033016908987210894
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8627450980392157,
"acc_stderr": 0.02415222596280158,
"acc_norm": 0.8627450980392157,
"acc_norm_stderr": 0.02415222596280158
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8143459915611815,
"acc_stderr": 0.02531049537694485,
"acc_norm": 0.8143459915611815,
"acc_norm_stderr": 0.02531049537694485
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.03768335959728744,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.03768335959728744
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8016528925619835,
"acc_stderr": 0.03640118271990947,
"acc_norm": 0.8016528925619835,
"acc_norm_stderr": 0.03640118271990947
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.41964285714285715,
"acc_stderr": 0.04684099321077106,
"acc_norm": 0.41964285714285715,
"acc_norm_stderr": 0.04684099321077106
},
"harness|hendrycksTest-management|5": {
"acc": 0.8252427184466019,
"acc_stderr": 0.03760178006026621,
"acc_norm": 0.8252427184466019,
"acc_norm_stderr": 0.03760178006026621
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.02250903393707781,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.02250903393707781
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8135376756066411,
"acc_stderr": 0.013927751372001506,
"acc_norm": 0.8135376756066411,
"acc_norm_stderr": 0.013927751372001506
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.024332146779134117,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.024332146779134117
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2636871508379888,
"acc_stderr": 0.014736926383761983,
"acc_norm": 0.2636871508379888,
"acc_norm_stderr": 0.014736926383761983
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7581699346405228,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.7581699346405228,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6913183279742765,
"acc_stderr": 0.026236965881153262,
"acc_norm": 0.6913183279742765,
"acc_norm_stderr": 0.026236965881153262
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135128,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135128
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.475177304964539,
"acc_stderr": 0.029790719243829714,
"acc_norm": 0.475177304964539,
"acc_norm_stderr": 0.029790719243829714
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4921773142112125,
"acc_stderr": 0.012768673076111903,
"acc_norm": 0.4921773142112125,
"acc_norm_stderr": 0.012768673076111903
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.7095588235294118,
"acc_stderr": 0.02757646862274053,
"acc_norm": 0.7095588235294118,
"acc_norm_stderr": 0.02757646862274053
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.018850084696468705,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.018850084696468705
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7181818181818181,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.7181818181818181,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578323,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578323
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.536144578313253,
"acc_stderr": 0.038823108508905954,
"acc_norm": 0.536144578313253,
"acc_norm_stderr": 0.038823108508905954
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8362573099415205,
"acc_stderr": 0.028380919596145866,
"acc_norm": 0.8362573099415205,
"acc_norm_stderr": 0.028380919596145866
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2766217870257038,
"mc1_stderr": 0.015659605755326923,
"mc2": 0.4367256901069689,
"mc2_stderr": 0.014358645276062254
},
"harness|winogrande|5": {
"acc": 0.8113654301499605,
"acc_stderr": 0.010995172318019806
},
"harness|gsm8k|5": {
"acc": 0.43442001516300227,
"acc_stderr": 0.013653507211411403
}
}
```
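The per-task JSON above follows a uniform shape (`"harness|<task>|<shots>"` keys mapping to `acc`/`acc_stderr` pairs), so it can be summarized programmatically. A minimal sketch of averaging the MMLU (`hendrycksTest`) accuracies — the dict below copies a few values from the results above for illustration; in practice you would `json.load()` the full results file from the repo:

```python
import json

# A few entries copied from the results JSON above; in practice,
# load the full results_*.json file from the dataset repository.
results = {
    "harness|hendrycksTest-management|5": {"acc": 0.8252427184466019},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8632478632478633},
    "harness|hendrycksTest-virology|5": {"acc": 0.536144578313253},
}

# Average accuracy over whichever MMLU subtasks are present.
mmlu_keys = [k for k in results if k.startswith("harness|hendrycksTest")]
mean_acc = sum(results[k]["acc"] for k in mmlu_keys) / len(mmlu_keys)
print(round(mean_acc, 4))
```

The same pattern extends to the non-MMLU tasks (`winogrande`, `gsm8k`, `truthfulqa:mc`), which use their own metric names (`acc`, `mc1`/`mc2`), so filter by key prefix before aggregating.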
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
imperialwarrior/open-australian-legal-qa-paraphrased-hard-gemini-with-emb | imperialwarrior | "2024-03-10T09:22:13Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T09:01:31Z" | ---
dataset_info:
features:
- name: pipeline_1_result
dtype: string
- name: pipeline_1_result_r_embeddings
sequence: float64
- name: pipeline_1_result_nr_embeddings
sequence: float64
- name: pipeline_2_context
dtype: string
- name: pipeline_2_result
dtype: string
- name: pipeline_2_result_r_embeddings
sequence: float64
- name: pipeline_2_result_nr_embeddings
sequence: float64
- name: pipeline_3_context
dtype: string
- name: pipeline_3_result
dtype: string
- name: pipeline_3_result_r_embeddings
sequence: float64
- name: pipeline_3_result_nr_embeddings
sequence: float64
- name: pipeline_4_context
dtype: string
- name: pipeline_4_result
dtype: string
- name: pipeline_4_result_r_embeddings
sequence: float64
- name: pipeline_4_result_nr_embeddings
sequence: float64
- name: pipeline_5_context
dtype: string
- name: pipeline_5_result
dtype: string
- name: pipeline_5_result_r_embeddings
sequence: float64
- name: pipeline_5_result_nr_embeddings
sequence: float64
- name: pipeline_6_context
dtype: string
- name: pipeline_6_result
dtype: string
- name: pipeline_6_result_r_embeddings
sequence: float64
- name: pipeline_6_result_nr_embeddings
sequence: float64
- name: pipeline_7_context
dtype: string
- name: pipeline_7_result
dtype: string
- name: pipeline_7_result_r_embeddings
sequence: float64
- name: pipeline_7_result_nr_embeddings
sequence: float64
- name: referenced_question
dtype: string
- name: answer
dtype: string
- name: answer_non_retrieval_embeddings
dtype: string
- name: answer_retrieval_embeddings
dtype: string
- name: question
dtype: string
- name: question_retrieval_embeddings
dtype: string
- name: question_non_retrieval_embeddings
dtype: string
- name: __index_level_0__
dtype: float64
- name: case_index
dtype: float64
- name: pipeline_6_case_indexes
sequence: int64
- name: pipeline_7_case_indexes
sequence: int64
splits:
- name: train
num_bytes: 134099471
num_examples: 203
download_size: 32075501
dataset_size: 134099471
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
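The `*_r_embeddings` / `*_nr_embeddings` features above are float sequences, presumably intended for retrieval and similarity comparison between pipeline results and reference answers. A minimal sketch of comparing two such vectors with cosine similarity — the vectors here are short synthetic placeholders, not real rows from the dataset:

```python
import math

def cosine_similarity(a, b):
    # Cosine of the angle between two equal-length embedding vectors.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Synthetic stand-ins for e.g. a `pipeline_1_result_r_embeddings` row
# and an `answer_retrieval_embeddings` row.
vec_result = [0.1, 0.3, -0.2, 0.7]
vec_answer = [0.1, 0.3, -0.2, 0.7]
print(cosine_similarity(vec_result, vec_answer))  # identical vectors -> 1.0
```

Real rows can be fetched with `datasets.load_dataset` using the repository name from the card header; the config above maps the single `train` split to `data/train-*` parquet files.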
|
John2747/edu | John2747 | "2024-03-10T10:56:46Z" | 0 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-03-10T09:06:06Z" | ---
license: openrail
---
|