datasetId | author | last_modified | downloads | likes | tags | task_categories | createdAt | card
---|---|---|---|---|---|---|---|---|
thanujaifin/guanaco-llama2-1k | thanujaifin | "2024-03-10T09:09:20Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T09:09:19Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1654448
num_examples: 1000
download_size: 966692
dataset_size: 1654448
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
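The leaderboard-details card that follows names each run's split with its timestamp (e.g. `2024_03_11T05_38_36.114353`) and aliases the newest run as `latest`. A minimal sketch of how that alias can be resolved client-side from the split names alone (the `latest_split` helper is hypothetical, not part of the `datasets` API):

```python
from datetime import datetime

def latest_split(split_names):
    """Return the newest run split, mirroring the card's 'latest' alias."""
    def parse(name):
        # Split names encode the run timestamp, e.g. "2024_03_11T05_38_36.114353"
        return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")
    return max(split_names, key=parse)

runs = [
    "2024_03_10T09_08_34.182758",
    "2024_03_11T05_38_26.378069",
    "2024_03_11T05_38_36.114353",
]
print(latest_split(runs))  # → 2024_03_11T05_38_36.114353
```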
open-llm-leaderboard-old/details_saltlux__luxia-21.4b-alignment-v1.0 | open-llm-leaderboard-old | "2024-03-11T05:41:05Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T09:10:41Z" | ---
pretty_name: Evaluation run of saltlux/luxia-21.4b-alignment-v1.0
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [saltlux/luxia-21.4b-alignment-v1.0](https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.0)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 3 run(s). Each run can\
\ be found as a specific split in each configuration, the split being named using\
\ the timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can, for instance, do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.0\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T05:38:36.114353](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.0/blob/main/results_2024-03-11T05-38-36.114353.json) (note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6866913238774542,\n\
\ \"acc_stderr\": 0.03138668671631704,\n \"acc_norm\": 0.6865746717114842,\n\
\ \"acc_norm_stderr\": 0.03204310199162772,\n \"mc1\": 0.6523867809057528,\n\
\ \"mc1_stderr\": 0.01667076918889731,\n \"mc2\": 0.791656253485744,\n\
\ \"mc2_stderr\": 0.01329262162821789\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7627986348122867,\n \"acc_stderr\": 0.012430399829260851,\n\
\ \"acc_norm\": 0.7747440273037542,\n \"acc_norm_stderr\": 0.012207839995407314\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.8125871340370444,\n\
\ \"acc_stderr\": 0.0038944505016930363,\n \"acc_norm\": 0.9188408683529178,\n\
\ \"acc_norm_stderr\": 0.0027252124485788636\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6518518518518519,\n\
\ \"acc_stderr\": 0.041153246103369526,\n \"acc_norm\": 0.6518518518518519,\n\
\ \"acc_norm_stderr\": 0.041153246103369526\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7763157894736842,\n \"acc_stderr\": 0.03391160934343604,\n\
\ \"acc_norm\": 0.7763157894736842,\n \"acc_norm_stderr\": 0.03391160934343604\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7358490566037735,\n \"acc_stderr\": 0.027134291628741713,\n\
\ \"acc_norm\": 0.7358490566037735,\n \"acc_norm_stderr\": 0.027134291628741713\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.8472222222222222,\n\
\ \"acc_stderr\": 0.030085743248565666,\n \"acc_norm\": 0.8472222222222222,\n\
\ \"acc_norm_stderr\": 0.030085743248565666\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.61,\n \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n\
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.44,\n \"acc_stderr\": 0.049888765156985884,\n \
\ \"acc_norm\": 0.44,\n \"acc_norm_stderr\": 0.049888765156985884\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6647398843930635,\n\
\ \"acc_stderr\": 0.03599586301247077,\n \"acc_norm\": 0.6647398843930635,\n\
\ \"acc_norm_stderr\": 0.03599586301247077\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.81,\n \"acc_stderr\": 0.039427724440366234,\n \"acc_norm\": 0.81,\n\
\ \"acc_norm_stderr\": 0.039427724440366234\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6808510638297872,\n \"acc_stderr\": 0.030472973363380042,\n\
\ \"acc_norm\": 0.6808510638297872,\n \"acc_norm_stderr\": 0.030472973363380042\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5614035087719298,\n\
\ \"acc_stderr\": 0.04668000738510455,\n \"acc_norm\": 0.5614035087719298,\n\
\ \"acc_norm_stderr\": 0.04668000738510455\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6137931034482759,\n \"acc_stderr\": 0.04057324734419036,\n\
\ \"acc_norm\": 0.6137931034482759,\n \"acc_norm_stderr\": 0.04057324734419036\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5211640211640212,\n \"acc_stderr\": 0.025728230952130726,\n \"\
acc_norm\": 0.5211640211640212,\n \"acc_norm_stderr\": 0.025728230952130726\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.8354838709677419,\n\
\ \"acc_stderr\": 0.02109084774593932,\n \"acc_norm\": 0.8354838709677419,\n\
\ \"acc_norm_stderr\": 0.02109084774593932\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.6059113300492611,\n \"acc_stderr\": 0.034381579670365446,\n\
\ \"acc_norm\": 0.6059113300492611,\n \"acc_norm_stderr\": 0.034381579670365446\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.73,\n \"acc_stderr\": 0.0446196043338474,\n \"acc_norm\"\
: 0.73,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.8242424242424242,\n \"acc_stderr\": 0.02972094300622445,\n\
\ \"acc_norm\": 0.8242424242424242,\n \"acc_norm_stderr\": 0.02972094300622445\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8434343434343434,\n \"acc_stderr\": 0.025890520358141454,\n \"\
acc_norm\": 0.8434343434343434,\n \"acc_norm_stderr\": 0.025890520358141454\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8860103626943006,\n \"acc_stderr\": 0.02293514405391943,\n\
\ \"acc_norm\": 0.8860103626943006,\n \"acc_norm_stderr\": 0.02293514405391943\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.023234581088428494,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.023234581088428494\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.36666666666666664,\n \"acc_stderr\": 0.029381620726465076,\n\
\ \"acc_norm\": 0.36666666666666664,\n \"acc_norm_stderr\": 0.029381620726465076\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7605042016806722,\n \"acc_stderr\": 0.027722065493361262,\n\
\ \"acc_norm\": 0.7605042016806722,\n \"acc_norm_stderr\": 0.027722065493361262\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.4370860927152318,\n \"acc_stderr\": 0.04050035722230636,\n \"\
acc_norm\": 0.4370860927152318,\n \"acc_norm_stderr\": 0.04050035722230636\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8587155963302753,\n \"acc_stderr\": 0.014933868987028072,\n \"\
acc_norm\": 0.8587155963302753,\n \"acc_norm_stderr\": 0.014933868987028072\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5972222222222222,\n \"acc_stderr\": 0.03344887382997865,\n \"\
acc_norm\": 0.5972222222222222,\n \"acc_norm_stderr\": 0.03344887382997865\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8676470588235294,\n \"acc_stderr\": 0.02378429752091885,\n \"\
acc_norm\": 0.8676470588235294,\n \"acc_norm_stderr\": 0.02378429752091885\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8481012658227848,\n \"acc_stderr\": 0.023363878096632443,\n \
\ \"acc_norm\": 0.8481012658227848,\n \"acc_norm_stderr\": 0.023363878096632443\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7354260089686099,\n\
\ \"acc_stderr\": 0.029605103217038325,\n \"acc_norm\": 0.7354260089686099,\n\
\ \"acc_norm_stderr\": 0.029605103217038325\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6717557251908397,\n \"acc_stderr\": 0.04118438565806298,\n\
\ \"acc_norm\": 0.6717557251908397,\n \"acc_norm_stderr\": 0.04118438565806298\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8429752066115702,\n \"acc_stderr\": 0.03321244842547128,\n \"\
acc_norm\": 0.8429752066115702,\n \"acc_norm_stderr\": 0.03321244842547128\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.754601226993865,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.754601226993865,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8446601941747572,\n \"acc_stderr\": 0.03586594738573974,\n\
\ \"acc_norm\": 0.8446601941747572,\n \"acc_norm_stderr\": 0.03586594738573974\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8974358974358975,\n\
\ \"acc_stderr\": 0.019875655027867447,\n \"acc_norm\": 0.8974358974358975,\n\
\ \"acc_norm_stderr\": 0.019875655027867447\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294407,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294407\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7138728323699421,\n \"acc_stderr\": 0.02433214677913413,\n\
\ \"acc_norm\": 0.7138728323699421,\n \"acc_norm_stderr\": 0.02433214677913413\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4659217877094972,\n\
\ \"acc_stderr\": 0.016683615837486863,\n \"acc_norm\": 0.4659217877094972,\n\
\ \"acc_norm_stderr\": 0.016683615837486863\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7810457516339869,\n \"acc_stderr\": 0.02367908986180772,\n\
\ \"acc_norm\": 0.7810457516339869,\n \"acc_norm_stderr\": 0.02367908986180772\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7363344051446945,\n\
\ \"acc_stderr\": 0.02502553850053234,\n \"acc_norm\": 0.7363344051446945,\n\
\ \"acc_norm_stderr\": 0.02502553850053234\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7839506172839507,\n \"acc_stderr\": 0.02289916291844579,\n\
\ \"acc_norm\": 0.7839506172839507,\n \"acc_norm_stderr\": 0.02289916291844579\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5531914893617021,\n \"acc_stderr\": 0.029658235097666907,\n \
\ \"acc_norm\": 0.5531914893617021,\n \"acc_norm_stderr\": 0.029658235097666907\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.48370273794002605,\n\
\ \"acc_stderr\": 0.012763450734699817,\n \"acc_norm\": 0.48370273794002605,\n\
\ \"acc_norm_stderr\": 0.012763450734699817\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6838235294117647,\n \"acc_stderr\": 0.02824568739146293,\n\
\ \"acc_norm\": 0.6838235294117647,\n \"acc_norm_stderr\": 0.02824568739146293\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6813725490196079,\n \"acc_stderr\": 0.01885008469646872,\n \
\ \"acc_norm\": 0.6813725490196079,\n \"acc_norm_stderr\": 0.01885008469646872\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7510204081632653,\n \"acc_stderr\": 0.027682979522960238,\n\
\ \"acc_norm\": 0.7510204081632653,\n \"acc_norm_stderr\": 0.027682979522960238\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.035887028128263686,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.035887028128263686\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5542168674698795,\n\
\ \"acc_stderr\": 0.038695433234721015,\n \"acc_norm\": 0.5542168674698795,\n\
\ \"acc_norm_stderr\": 0.038695433234721015\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8187134502923976,\n \"acc_stderr\": 0.029547741687640038,\n\
\ \"acc_norm\": 0.8187134502923976,\n \"acc_norm_stderr\": 0.029547741687640038\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.6523867809057528,\n\
\ \"mc1_stderr\": 0.01667076918889731,\n \"mc2\": 0.791656253485744,\n\
\ \"mc2_stderr\": 0.01329262162821789\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8745067087608525,\n \"acc_stderr\": 0.009310542237486182\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6239575435936315,\n \
\ \"acc_stderr\": 0.013342532064849767\n }\n}\n```"
repo_url: https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.0
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|arc:challenge|25_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|arc:challenge|25_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|arc:challenge|25_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|gsm8k|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|gsm8k|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|gsm8k|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hellaswag|10_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hellaswag|10_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hellaswag|10_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T09-08-34.182758.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-38-26.378069.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-38-36.114353.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T05-38-36.114353.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- '**/details_harness|winogrande|5_2024-03-10T09-08-34.182758.parquet'
- split: 2024_03_11T05_38_26.378069
path:
- '**/details_harness|winogrande|5_2024-03-11T05-38-26.378069.parquet'
- split: 2024_03_11T05_38_36.114353
path:
- '**/details_harness|winogrande|5_2024-03-11T05-38-36.114353.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T05-38-36.114353.parquet'
- config_name: results
data_files:
- split: 2024_03_10T09_08_34.182758
path:
- results_2024-03-10T09-08-34.182758.parquet
- split: 2024_03_11T05_38_26.378069
path:
- results_2024-03-11T05-38-26.378069.parquet
- split: 2024_03_11T05_38_36.114353
path:
- results_2024-03-11T05-38-36.114353.parquet
- split: latest
path:
- results_2024-03-11T05-38-36.114353.parquet
---
# Dataset Card for Evaluation run of saltlux/luxia-21.4b-alignment-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saltlux/luxia-21.4b-alignment-v1.0](https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 3 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.0",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-11T05:38:36.114353](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.0/blob/main/results_2024-03-11T05-38-36.114353.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6866913238774542,
"acc_stderr": 0.03138668671631704,
"acc_norm": 0.6865746717114842,
"acc_norm_stderr": 0.03204310199162772,
"mc1": 0.6523867809057528,
"mc1_stderr": 0.01667076918889731,
"mc2": 0.791656253485744,
"mc2_stderr": 0.01329262162821789
},
"harness|arc:challenge|25": {
"acc": 0.7627986348122867,
"acc_stderr": 0.012430399829260851,
"acc_norm": 0.7747440273037542,
"acc_norm_stderr": 0.012207839995407314
},
"harness|hellaswag|10": {
"acc": 0.8125871340370444,
"acc_stderr": 0.0038944505016930363,
"acc_norm": 0.9188408683529178,
"acc_norm_stderr": 0.0027252124485788636
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6518518518518519,
"acc_stderr": 0.041153246103369526,
"acc_norm": 0.6518518518518519,
"acc_norm_stderr": 0.041153246103369526
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7763157894736842,
"acc_stderr": 0.03391160934343604,
"acc_norm": 0.7763157894736842,
"acc_norm_stderr": 0.03391160934343604
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7358490566037735,
"acc_stderr": 0.027134291628741713,
"acc_norm": 0.7358490566037735,
"acc_norm_stderr": 0.027134291628741713
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.8472222222222222,
"acc_stderr": 0.030085743248565666,
"acc_norm": 0.8472222222222222,
"acc_norm_stderr": 0.030085743248565666
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.44,
"acc_stderr": 0.049888765156985884,
"acc_norm": 0.44,
"acc_norm_stderr": 0.049888765156985884
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6647398843930635,
"acc_stderr": 0.03599586301247077,
"acc_norm": 0.6647398843930635,
"acc_norm_stderr": 0.03599586301247077
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.81,
"acc_stderr": 0.039427724440366234,
"acc_norm": 0.81,
"acc_norm_stderr": 0.039427724440366234
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6808510638297872,
"acc_stderr": 0.030472973363380042,
"acc_norm": 0.6808510638297872,
"acc_norm_stderr": 0.030472973363380042
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5614035087719298,
"acc_stderr": 0.04668000738510455,
"acc_norm": 0.5614035087719298,
"acc_norm_stderr": 0.04668000738510455
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6137931034482759,
"acc_stderr": 0.04057324734419036,
"acc_norm": 0.6137931034482759,
"acc_norm_stderr": 0.04057324734419036
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5211640211640212,
"acc_stderr": 0.025728230952130726,
"acc_norm": 0.5211640211640212,
"acc_norm_stderr": 0.025728230952130726
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.8354838709677419,
"acc_stderr": 0.02109084774593932,
"acc_norm": 0.8354838709677419,
"acc_norm_stderr": 0.02109084774593932
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.6059113300492611,
"acc_stderr": 0.034381579670365446,
"acc_norm": 0.6059113300492611,
"acc_norm_stderr": 0.034381579670365446
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.73,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.73,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.8242424242424242,
"acc_stderr": 0.02972094300622445,
"acc_norm": 0.8242424242424242,
"acc_norm_stderr": 0.02972094300622445
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8434343434343434,
"acc_stderr": 0.025890520358141454,
"acc_norm": 0.8434343434343434,
"acc_norm_stderr": 0.025890520358141454
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8860103626943006,
"acc_stderr": 0.02293514405391943,
"acc_norm": 0.8860103626943006,
"acc_norm_stderr": 0.02293514405391943
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.7,
"acc_stderr": 0.023234581088428494,
"acc_norm": 0.7,
"acc_norm_stderr": 0.023234581088428494
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.36666666666666664,
"acc_stderr": 0.029381620726465076,
"acc_norm": 0.36666666666666664,
"acc_norm_stderr": 0.029381620726465076
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7605042016806722,
"acc_stderr": 0.027722065493361262,
"acc_norm": 0.7605042016806722,
"acc_norm_stderr": 0.027722065493361262
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.4370860927152318,
"acc_stderr": 0.04050035722230636,
"acc_norm": 0.4370860927152318,
"acc_norm_stderr": 0.04050035722230636
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8587155963302753,
"acc_stderr": 0.014933868987028072,
"acc_norm": 0.8587155963302753,
"acc_norm_stderr": 0.014933868987028072
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.03344887382997865,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.03344887382997865
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8676470588235294,
"acc_stderr": 0.02378429752091885,
"acc_norm": 0.8676470588235294,
"acc_norm_stderr": 0.02378429752091885
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8481012658227848,
"acc_stderr": 0.023363878096632443,
"acc_norm": 0.8481012658227848,
"acc_norm_stderr": 0.023363878096632443
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7354260089686099,
"acc_stderr": 0.029605103217038325,
"acc_norm": 0.7354260089686099,
"acc_norm_stderr": 0.029605103217038325
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6717557251908397,
"acc_stderr": 0.04118438565806298,
"acc_norm": 0.6717557251908397,
"acc_norm_stderr": 0.04118438565806298
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8429752066115702,
"acc_stderr": 0.03321244842547128,
"acc_norm": 0.8429752066115702,
"acc_norm_stderr": 0.03321244842547128
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.754601226993865,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.754601226993865,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.8446601941747572,
"acc_stderr": 0.03586594738573974,
"acc_norm": 0.8446601941747572,
"acc_norm_stderr": 0.03586594738573974
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8974358974358975,
"acc_stderr": 0.019875655027867447,
"acc_norm": 0.8974358974358975,
"acc_norm_stderr": 0.019875655027867447
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294407,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294407
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7138728323699421,
"acc_stderr": 0.02433214677913413,
"acc_norm": 0.7138728323699421,
"acc_norm_stderr": 0.02433214677913413
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4659217877094972,
"acc_stderr": 0.016683615837486863,
"acc_norm": 0.4659217877094972,
"acc_norm_stderr": 0.016683615837486863
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7810457516339869,
"acc_stderr": 0.02367908986180772,
"acc_norm": 0.7810457516339869,
"acc_norm_stderr": 0.02367908986180772
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7363344051446945,
"acc_stderr": 0.02502553850053234,
"acc_norm": 0.7363344051446945,
"acc_norm_stderr": 0.02502553850053234
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7839506172839507,
"acc_stderr": 0.02289916291844579,
"acc_norm": 0.7839506172839507,
"acc_norm_stderr": 0.02289916291844579
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5531914893617021,
"acc_stderr": 0.029658235097666907,
"acc_norm": 0.5531914893617021,
"acc_norm_stderr": 0.029658235097666907
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.48370273794002605,
"acc_stderr": 0.012763450734699817,
"acc_norm": 0.48370273794002605,
"acc_norm_stderr": 0.012763450734699817
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6838235294117647,
"acc_stderr": 0.02824568739146293,
"acc_norm": 0.6838235294117647,
"acc_norm_stderr": 0.02824568739146293
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.01885008469646872,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.01885008469646872
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7510204081632653,
"acc_stderr": 0.027682979522960238,
"acc_norm": 0.7510204081632653,
"acc_norm_stderr": 0.027682979522960238
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.035887028128263686,
"acc_norm": 0.85,
"acc_norm_stderr": 0.035887028128263686
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5542168674698795,
"acc_stderr": 0.038695433234721015,
"acc_norm": 0.5542168674698795,
"acc_norm_stderr": 0.038695433234721015
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8187134502923976,
"acc_stderr": 0.029547741687640038,
"acc_norm": 0.8187134502923976,
"acc_norm_stderr": 0.029547741687640038
},
"harness|truthfulqa:mc|0": {
"mc1": 0.6523867809057528,
"mc1_stderr": 0.01667076918889731,
"mc2": 0.791656253485744,
"mc2_stderr": 0.01329262162821789
},
"harness|winogrande|5": {
"acc": 0.8745067087608525,
"acc_stderr": 0.009310542237486182
},
"harness|gsm8k|5": {
"acc": 0.6239575435936315,
"acc_stderr": 0.013342532064849767
}
}
```
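The per-task metrics above are plain nested JSON (task name mapped to its metric dict). A minimal sketch of flattening them for comparison, trimmed here to two tasks for illustration (the full dict has one entry per task):

```python
# The results JSON above is a nested dict of task -> metrics.
# Trimmed to two tasks for illustration.
results = {
    "harness|arc:challenge|25": {"acc": 0.7627986348122867, "acc_norm": 0.7747440273037542},
    "harness|hellaswag|10": {"acc": 0.8125871340370444, "acc_norm": 0.9188408683529178},
}

# Flatten to task -> normalized accuracy for easy side-by-side comparison.
acc_norm_by_task = {task: m["acc_norm"] for task, m in results.items()}
print(acc_norm_by_task["harness|hellaswag|10"])  # 0.9188408683529178
```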
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
0x7o/klara-voice | 0x7o | "2024-03-10T09:45:01Z" | 0 | 3 | [
"task_categories:text-to-speech",
"language:ru",
"license:mit",
"size_categories:10K<n<100K",
"format:parquet",
"modality:audio",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"text-to-speech"
] | "2024-03-10T09:32:38Z" | ---
dataset_info:
features:
- name: audio
dtype: audio
- name: text
dtype: string
splits:
- name: train
num_bytes: 8736190036.812
num_examples: 29142
download_size: 10256327673
dataset_size: 8736190036.812
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: mit
task_categories:
- text-to-speech
language:
- ru
size_categories:
- 10K<n<100K
---
# Klara Voice Dataset
| Metric | Value |
| --- | --- |
|Total length of dataset: | 32.57 hours |
|Average audio length:| 4.02 seconds|
|Maximum length:| 22.52 seconds| |
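As a quick sanity check, the stats above are mutually consistent: total length divided by average clip length roughly matches the 29,142 examples listed in the metadata. A back-of-the-envelope sketch:

```python
# Back-of-the-envelope check of the table above.
total_hours = 32.57
avg_clip_seconds = 4.02

approx_clips = total_hours * 3600 / avg_clip_seconds
print(round(approx_clips))  # 29167, close to the 29,142 examples in the metadata
```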
imperialwarrior/open-australian-legal-qa-paraphrased-easy-gemini-with-emb | imperialwarrior | "2024-03-10T09:39:18Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T09:39:10Z" | ---
dataset_info:
features:
- name: pipeline_1_result
dtype: string
- name: pipeline_1_result_r_embeddings
sequence: float64
- name: pipeline_1_result_nr_embeddings
sequence: float64
- name: pipeline_2_context
dtype: string
- name: pipeline_2_result
dtype: string
- name: pipeline_2_result_r_embeddings
sequence: float64
- name: pipeline_2_result_nr_embeddings
sequence: float64
- name: pipeline_3_context
dtype: string
- name: pipeline_3_result
dtype: string
- name: pipeline_3_result_r_embeddings
sequence: float64
- name: pipeline_3_result_nr_embeddings
sequence: float64
- name: pipeline_4_context
dtype: string
- name: pipeline_4_result
dtype: string
- name: pipeline_4_result_r_embeddings
sequence: float64
- name: pipeline_4_result_nr_embeddings
sequence: float64
- name: pipeline_5_context
dtype: string
- name: pipeline_5_result
dtype: string
- name: pipeline_5_result_r_embeddings
sequence: float64
- name: pipeline_5_result_nr_embeddings
sequence: float64
- name: pipeline_6_context
dtype: string
- name: pipeline_6_result
dtype: string
- name: pipeline_6_result_r_embeddings
sequence: float64
- name: pipeline_6_result_nr_embeddings
sequence: float64
- name: pipeline_7_context
dtype: string
- name: pipeline_7_result
dtype: string
- name: pipeline_7_result_r_embeddings
sequence: float64
- name: pipeline_7_result_nr_embeddings
sequence: float64
- name: referenced_question
dtype: string
- name: answer
dtype: string
- name: answer_non_retrieval_embeddings
dtype: string
- name: answer_retrieval_embeddings
dtype: string
- name: question
dtype: string
- name: question_retrieval_embeddings
dtype: string
- name: question_non_retrieval_embeddings
dtype: string
- name: __index_level_0__
dtype: float64
- name: case_index
dtype: float64
- name: pipeline_6_case_indexes
sequence: int64
- name: pipeline_7_case_indexes
sequence: int64
splits:
- name: train
num_bytes: 136671259
num_examples: 207
download_size: 32364885
dataset_size: 136671259
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
jmarmier/scs-phase-iii | jmarmier | "2024-03-10T10:11:00Z" | 0 | 0 | [
"license:mit",
"size_categories:1K<n<10K",
"format:csv",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T09:56:35Z" | ---
license: mit
---
|
AWeirdDev/bill-wurtz | AWeirdDev | "2024-03-10T10:34:32Z" | 0 | 0 | [
"task_categories:question-answering",
"task_categories:text-generation",
"task_categories:text2text-generation",
"language:en",
"license:apache-2.0",
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"psychology",
"philosophy"
] | [
"question-answering",
"text-generation",
"text2text-generation"
] | "2024-03-10T10:00:33Z" | ---
dataset_info:
features:
- name: link
dtype: string
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 26081145
num_examples: 129362
download_size: 11920936
dataset_size: 26081145
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
license: apache-2.0
task_categories:
- question-answering
- text-generation
- text2text-generation
language:
- en
tags:
- psychology
- philosophy
pretty_name: Bill Wurtz Q&A
size_categories:
- 100K<n<1M
---
<div align="center">
<img alt="hi huggingface banner"
src="https://cdn-uploads.huggingface.co/production/uploads/640739e3a5e2ff2832ead08b/uO4HuXeoXgd0aQQ2t6Zhw.png"
/>
</div>
<br />
# bill-wurtz
All questions Bill Wurtz answers on [billwurtz.com/questions](https://billwurtz.com/questions/questions.html). I think they're pretty humorous.
- 🐣 Fetched on: 2024-3-10 (Mar 10th)
- 🍕 For tasks: `text-generation`, `question-answering`, + more
- 📜 Rows: `129,362` (129k)
```python
DatasetDict({
train: Dataset({
features: ['link', 'question', 'answer'],
num_rows: 129362
})
})
```
## Use This Dataset
Download with [🤗 Datasets](https://pypi.org/project/datasets):
```python
from datasets import load_dataset
dataset = load_dataset("AWeirdDev/bill-wurtz")
dataset["train"][0]
# => { "link": "...", "question": "your opinion on ceilings?", "answer": "incredible" }
```
<details>
<summary><b>🧹 Cleaning the dataset</b></summary>
<p>
Some questions/answers may be blank. Clean the dataset before you use it.
```python
from datasets import Dataset
raw_dataset = dataset["train"].to_list()
# Note: deleting items while iterating shifts indices and skips rows; filter instead.
raw_dataset = [
    d for d in raw_dataset
    if d['question'].strip() and d['answer'].strip()
]
raw_dataset = Dataset.from_list(raw_dataset)
raw_dataset
# Dataset({
# features: ['link', 'question', 'answer'],
# num_rows: 123922
# })
```
</p>
</details>
|
allandclive/MakerereRadioSpeech_20Hrs | allandclive | "2024-03-10T10:58:16Z" | 0 | 0 | [
"task_categories:automatic-speech-recognition",
"language:lg",
"region:us"
] | [
"automatic-speech-recognition"
] | "2024-03-10T10:01:11Z" | ---
task_categories:
- automatic-speech-recognition
language:
- lg
---
# Dataset Card for MakerereRadioSpeech_20Hrs
Luganda radio speech dataset: 20 hours of human-transcribed radio speech. The audio is 16 kHz, mono, 16-bit. `cleaned.csv` contains the cleaned transcripts and `uncleaned.csv` contains the raw (uncleaned) transcripts.
|
oneonlee/cleansed_emocontext | oneonlee | "2024-03-10T10:24:36Z" | 0 | 1 | [
"task_categories:text-classification",
"task_ids:sentiment-classification",
"annotations_creators:expert-generated",
"language_creators:crowdsourced",
"source_datasets:emo",
"language:en",
"license:mpl-2.0",
"size_categories:10K<n<100K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"conversation"
] | [
"text-classification"
] | "2024-03-10T10:18:30Z" | ---
annotations_creators:
- expert-generated
language_creators:
- crowdsourced
license: mpl-2.0
task_categories:
- text-classification
task_ids:
- sentiment-classification
language:
- en
tags:
- conversation
size_categories:
- 10K<n<100K
source_datasets:
- emo
pretty_name: Cleansed_EmoContext
dataset_info:
features:
- name: turn1
dtype: string
- name: turn2
dtype: string
- name: turn3
dtype: string
- name: label
dtype:
class_label:
names:
"0": others
"1": happy
"2": sad
"3": angry
config_name: cleansed_emo2019
# splits:
# - name: train
# num_bytes: 2433205
# num_examples: 30160
# - name: test
# num_bytes: 421555
# num_examples: 5509
# download_size: 3362556
# dataset_size: 2854760
---
# Dataset Card for "cleansed_emocontext"
- `cleansed_emocontext` is a **cleansed and normalized version** of [`emo`](https://huggingface.co/datasets/emo).
- For cleansing and normalization, [`data_cleaning.py`](https://github.com/oneonlee/cleansed_emocontext/blob/master/helpers/data_cleaning.py) was used, [modifying the code](https://github.com/oneonlee/cleansed_emocontext/commit/c09b020dfb49692a1c5fcd2099d531503d9bb8b5#diff-266912260148f110c4e7fe00b6cdef4c23b024dca8c693a0dd3c83f25ba56f54) provided on the [official EmoContext GitHub](https://github.com/DhruvDh/emocontext).
## Table of Contents
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Supported Tasks and Leaderboards](#supported-tasks-and-leaderboards)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Data Instances](#data-instances)
- [Data Fields](#data-fields)
- [Data Splits](#data-splits)
- [Dataset Creation](#dataset-creation)
- [Curation Rationale](#curation-rationale)
- [Source Data](#source-data)
- [Annotations](#annotations)
- [Personal and Sensitive Information](#personal-and-sensitive-information)
- [Considerations for Using the Data](#considerations-for-using-the-data)
- [Social Impact of Dataset](#social-impact-of-dataset)
- [Discussion of Biases](#discussion-of-biases)
- [Other Known Limitations](#other-known-limitations)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
- [Licensing Information](#licensing-information)
- [Citation Information](#citation-information)
- [Contributions](#contributions)
## Dataset Description
- **Homepage:** [SemEval-2019 Task 3: EmoContext Contextual Emotion Detection in Text](https://aclanthology.org/S19-2005/)
- **Repository:** [EmoContext GitHub](https://github.com/DhruvDh/emocontext)
- **Paper:** [SemEval-2019 Task 3: EmoContext Contextual Emotion Detection in Text](https://aclanthology.org/S19-2005/)
- **Point of Contact:** [More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
- **Size of downloaded dataset files:** 3.37 MB
- **Size of the generated dataset:** 2.85 MB
- **Total amount of disk used:** 6.22 MB
### Dataset Summary
In this dataset, given a textual dialogue i.e. an utterance along with two previous turns of context, the goal was to infer the underlying emotion of the utterance by choosing from four emotion classes - Happy, Sad, Angry and Others.
### Supported Tasks and Leaderboards
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Languages
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Dataset Structure
### Data Instances
#### cleansed_emo2019
An example of 'train' looks as follows.
```
{
"label": 0,
"turn1": "don't worry i'm girl",
"turn2": "hmm how do i know if you are",
"turn3": "what's your name ?"
}
```
### Data Fields
The data fields are the same among all splits.
#### cleansed_emo2019
- `turn1`, `turn2`, `turn3`: a `string` feature.
- `label`: a classification label, with possible values including `others` (0), `happy` (1), `sad` (2), `angry` (3).
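The integer ids map back to emotion names by position, matching the `class_label` order above. A minimal lookup sketch (plain Python; `decode_label` is an illustrative helper, not part of the dataset):

```python
# Order matches the class_label definition: 0=others, 1=happy, 2=sad, 3=angry.
LABEL_NAMES = ["others", "happy", "sad", "angry"]

def decode_label(label_id: int) -> str:
    """Return the emotion name for a cleansed_emocontext label id."""
    return LABEL_NAMES[label_id]

print(decode_label(0))  # others
```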
### Data Splits
| name | train | dev | test |
| ---------------- | ----: | ---: | ---: |
| cleansed_emo2019 | 30160 | 2755 | 5509 |
## Dataset Creation
### Curation Rationale
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Source Data
#### Initial Data Collection and Normalization
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the source language producers?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Annotations
#### Annotation process
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
#### Who are the annotators?
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Personal and Sensitive Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Considerations for Using the Data
### Social Impact of Dataset
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Discussion of Biases
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Other Known Limitations
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
## Additional Information
### Dataset Curators
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Licensing Information
[More Information Needed](https://github.com/huggingface/datasets/blob/master/CONTRIBUTING.md#how-to-contribute-to-the-dataset-cards)
### Citation Information
```
@inproceedings{chatterjee-etal-2019-semeval,
title={SemEval-2019 Task 3: EmoContext Contextual Emotion Detection in Text},
author={Ankush Chatterjee and Kedhar Nath Narahari and Meghana Joshi and Puneet Agrawal},
booktitle={Proceedings of the 13th International Workshop on Semantic Evaluation},
year={2019},
address={Minneapolis, Minnesota, USA},
publisher={Association for Computational Linguistics},
url={https://www.aclweb.org/anthology/S19-2005},
doi={10.18653/v1/S19-2005},
pages={39--48},
abstract={In this paper, we present the SemEval-2019 Task 3 - EmoContext: Contextual Emotion Detection in Text. Lack of facial expressions and voice modulations make detecting emotions in text a challenging problem. For instance, as humans, on reading ''Why don't you ever text me!'' we can either interpret it as a sad or angry emotion and the same ambiguity exists for machines. However, the context of dialogue can prove helpful in detection of the emotion. In this task, given a textual dialogue i.e. an utterance along with two previous turns of context, the goal was to infer the underlying emotion of the utterance by choosing from four emotion classes - Happy, Sad, Angry and Others. To facilitate the participation in this task, textual dialogues from user interaction with a conversational agent were taken and annotated for emotion classes after several data processing steps. A training data set of 30160 dialogues, and two evaluation data sets, Test1 and Test2, containing 2755 and 5509 dialogues respectively were released to the participants. A total of 311 teams made submissions to this task. The final leader-board was evaluated on Test2 data set, and the highest ranked submission achieved 79.59 micro-averaged F1 score. Our analysis of systems submitted to the task indicate that Bi-directional LSTM was the most common choice of neural architecture used, and most of the systems had the best performance for the Sad emotion class, and the worst for the Happy emotion class}
}
```
|
Hardik1234/reactjs | Hardik1234 | "2024-03-10T11:00:26Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T10:26:35Z" | ---
dataset_info:
features:
- name: path
dtype: string
- name: repo_name
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 2054642846
num_examples: 512984
download_size: 778252919
dataset_size: 2054642846
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
hanesh007/mtdataset_exp | hanesh007 | "2024-03-11T02:31:44Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T10:39:26Z" | ---
license: apache-2.0
---
|
saimdev/william2 | saimdev | "2024-03-10T10:41:32Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T10:41:27Z" | ---
dataset_info:
features:
- name: image
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 27817
num_examples: 605
download_size: 6276
dataset_size: 27817
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_frankenmerger__gemoy-4b-instruct | open-llm-leaderboard-old | "2024-03-10T11:01:39Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T11:01:18Z" | ---
pretty_name: Evaluation run of frankenmerger/gemoy-4b-instruct
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frankenmerger/gemoy-4b-instruct](https://huggingface.co/frankenmerger/gemoy-4b-instruct)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T10:59:13.672299](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct/blob/main/results_2024-03-10T10-59-13.672299.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3635342339637508,\n\
\ \"acc_stderr\": 0.03346560799526674,\n \"acc_norm\": 0.36857377594697643,\n\
\ \"acc_norm_stderr\": 0.03436928129673128,\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006518,\n \"mc2\": 0.46641168216975853,\n\
\ \"mc2_stderr\": 0.016269583261373614\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3728668941979522,\n \"acc_stderr\": 0.014131176760131167,\n\
\ \"acc_norm\": 0.4069965870307167,\n \"acc_norm_stderr\": 0.01435639941800913\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.44981079466241786,\n\
\ \"acc_stderr\": 0.004964579685712441,\n \"acc_norm\": 0.5802628958374826,\n\
\ \"acc_norm_stderr\": 0.004925072159723828\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816507,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816507\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.362962962962963,\n\
\ \"acc_stderr\": 0.041539484047424,\n \"acc_norm\": 0.362962962962963,\n\
\ \"acc_norm_stderr\": 0.041539484047424\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.32894736842105265,\n \"acc_stderr\": 0.038234289699266046,\n\
\ \"acc_norm\": 0.32894736842105265,\n \"acc_norm_stderr\": 0.038234289699266046\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.52,\n\
\ \"acc_stderr\": 0.05021167315686781,\n \"acc_norm\": 0.52,\n \
\ \"acc_norm_stderr\": 0.05021167315686781\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.4075471698113208,\n \"acc_stderr\": 0.030242233800854494,\n\
\ \"acc_norm\": 0.4075471698113208,\n \"acc_norm_stderr\": 0.030242233800854494\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.039420826399272135,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.039420826399272135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.24,\n \"acc_stderr\": 0.042923469599092816,\n \
\ \"acc_norm\": 0.24,\n \"acc_norm_stderr\": 0.042923469599092816\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.28,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\"\
: 0.28,\n \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2832369942196532,\n\
\ \"acc_stderr\": 0.034355680560478746,\n \"acc_norm\": 0.2832369942196532,\n\
\ \"acc_norm_stderr\": 0.034355680560478746\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617747,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617747\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.46,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\": 0.46,\n\
\ \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.34893617021276596,\n \"acc_stderr\": 0.031158522131357783,\n\
\ \"acc_norm\": 0.34893617021276596,\n \"acc_norm_stderr\": 0.031158522131357783\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2631578947368421,\n\
\ \"acc_stderr\": 0.04142439719489362,\n \"acc_norm\": 0.2631578947368421,\n\
\ \"acc_norm_stderr\": 0.04142439719489362\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.4413793103448276,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.4413793103448276,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2830687830687831,\n \"acc_stderr\": 0.023201392938194974,\n \"\
acc_norm\": 0.2830687830687831,\n \"acc_norm_stderr\": 0.023201392938194974\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.25396825396825395,\n\
\ \"acc_stderr\": 0.03893259610604674,\n \"acc_norm\": 0.25396825396825395,\n\
\ \"acc_norm_stderr\": 0.03893259610604674\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.35161290322580646,\n\
\ \"acc_stderr\": 0.027162537826948458,\n \"acc_norm\": 0.35161290322580646,\n\
\ \"acc_norm_stderr\": 0.027162537826948458\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22660098522167488,\n \"acc_stderr\": 0.029454863835292992,\n\
\ \"acc_norm\": 0.22660098522167488,\n \"acc_norm_stderr\": 0.029454863835292992\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \"acc_norm\"\
: 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.40606060606060607,\n \"acc_stderr\": 0.03834816355401181,\n\
\ \"acc_norm\": 0.40606060606060607,\n \"acc_norm_stderr\": 0.03834816355401181\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.4797979797979798,\n \"acc_stderr\": 0.035594435655639196,\n \"\
acc_norm\": 0.4797979797979798,\n \"acc_norm_stderr\": 0.035594435655639196\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47150259067357514,\n \"acc_stderr\": 0.036025735712884414,\n\
\ \"acc_norm\": 0.47150259067357514,\n \"acc_norm_stderr\": 0.036025735712884414\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.3487179487179487,\n \"acc_stderr\": 0.024162780284017717,\n\
\ \"acc_norm\": 0.3487179487179487,\n \"acc_norm_stderr\": 0.024162780284017717\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.21481481481481482,\n \"acc_stderr\": 0.025040443877000683,\n \
\ \"acc_norm\": 0.21481481481481482,\n \"acc_norm_stderr\": 0.025040443877000683\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.31932773109243695,\n \"acc_stderr\": 0.030283995525884396,\n\
\ \"acc_norm\": 0.31932773109243695,\n \"acc_norm_stderr\": 0.030283995525884396\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45504587155963305,\n \"acc_stderr\": 0.021350503090925167,\n \"\
acc_norm\": 0.45504587155963305,\n \"acc_norm_stderr\": 0.021350503090925167\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.22685185185185186,\n \"acc_stderr\": 0.02856165010242226,\n \"\
acc_norm\": 0.22685185185185186,\n \"acc_norm_stderr\": 0.02856165010242226\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.034542365853806094,\n \"\
acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.034542365853806094\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.45569620253164556,\n \"acc_stderr\": 0.03241920684693334,\n \
\ \"acc_norm\": 0.45569620253164556,\n \"acc_norm_stderr\": 0.03241920684693334\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.35874439461883406,\n\
\ \"acc_stderr\": 0.03219079200419997,\n \"acc_norm\": 0.35874439461883406,\n\
\ \"acc_norm_stderr\": 0.03219079200419997\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.4122137404580153,\n \"acc_stderr\": 0.04317171194870254,\n\
\ \"acc_norm\": 0.4122137404580153,\n \"acc_norm_stderr\": 0.04317171194870254\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5537190082644629,\n \"acc_stderr\": 0.0453793517794788,\n \"acc_norm\"\
: 0.5537190082644629,\n \"acc_norm_stderr\": 0.0453793517794788\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.46296296296296297,\n\
\ \"acc_stderr\": 0.04820403072760627,\n \"acc_norm\": 0.46296296296296297,\n\
\ \"acc_norm_stderr\": 0.04820403072760627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.3312883435582822,\n \"acc_stderr\": 0.03697983910025588,\n\
\ \"acc_norm\": 0.3312883435582822,\n \"acc_norm_stderr\": 0.03697983910025588\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285712,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285712\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.4368932038834951,\n \"acc_stderr\": 0.04911147107365777,\n\
\ \"acc_norm\": 0.4368932038834951,\n \"acc_norm_stderr\": 0.04911147107365777\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.5769230769230769,\n\
\ \"acc_stderr\": 0.032366121762202014,\n \"acc_norm\": 0.5769230769230769,\n\
\ \"acc_norm_stderr\": 0.032366121762202014\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542127\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.421455938697318,\n\
\ \"acc_stderr\": 0.017657976412654857,\n \"acc_norm\": 0.421455938697318,\n\
\ \"acc_norm_stderr\": 0.017657976412654857\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.44508670520231214,\n \"acc_stderr\": 0.02675625512966377,\n\
\ \"acc_norm\": 0.44508670520231214,\n \"acc_norm_stderr\": 0.02675625512966377\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24916201117318434,\n\
\ \"acc_stderr\": 0.01446589382985993,\n \"acc_norm\": 0.24916201117318434,\n\
\ \"acc_norm_stderr\": 0.01446589382985993\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.46405228758169936,\n \"acc_stderr\": 0.02855582751652879,\n\
\ \"acc_norm\": 0.46405228758169936,\n \"acc_norm_stderr\": 0.02855582751652879\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3665594855305466,\n\
\ \"acc_stderr\": 0.02736807824397163,\n \"acc_norm\": 0.3665594855305466,\n\
\ \"acc_norm_stderr\": 0.02736807824397163\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.4166666666666667,\n \"acc_stderr\": 0.02743162372241502,\n\
\ \"acc_norm\": 0.4166666666666667,\n \"acc_norm_stderr\": 0.02743162372241502\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2907801418439716,\n \"acc_stderr\": 0.027090664368353178,\n \
\ \"acc_norm\": 0.2907801418439716,\n \"acc_norm_stderr\": 0.027090664368353178\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3259452411994785,\n\
\ \"acc_stderr\": 0.01197150729498278,\n \"acc_norm\": 0.3259452411994785,\n\
\ \"acc_norm_stderr\": 0.01197150729498278\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.25735294117647056,\n \"acc_stderr\": 0.026556519470041513,\n\
\ \"acc_norm\": 0.25735294117647056,\n \"acc_norm_stderr\": 0.026556519470041513\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3545751633986928,\n \"acc_stderr\": 0.019353360547553714,\n \
\ \"acc_norm\": 0.3545751633986928,\n \"acc_norm_stderr\": 0.019353360547553714\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.38181818181818183,\n\
\ \"acc_stderr\": 0.04653429807913509,\n \"acc_norm\": 0.38181818181818183,\n\
\ \"acc_norm_stderr\": 0.04653429807913509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5020408163265306,\n \"acc_stderr\": 0.0320089533497105,\n\
\ \"acc_norm\": 0.5020408163265306,\n \"acc_norm_stderr\": 0.0320089533497105\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.3781094527363184,\n\
\ \"acc_stderr\": 0.03428867848778657,\n \"acc_norm\": 0.3781094527363184,\n\
\ \"acc_norm_stderr\": 0.03428867848778657\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.58,\n \"acc_stderr\": 0.04960449637488583,\n \
\ \"acc_norm\": 0.58,\n \"acc_norm_stderr\": 0.04960449637488583\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.39156626506024095,\n\
\ \"acc_stderr\": 0.03799857454479636,\n \"acc_norm\": 0.39156626506024095,\n\
\ \"acc_norm_stderr\": 0.03799857454479636\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.4152046783625731,\n \"acc_stderr\": 0.03779275945503201,\n\
\ \"acc_norm\": 0.4152046783625731,\n \"acc_norm_stderr\": 0.03779275945503201\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2729498164014688,\n\
\ \"mc1_stderr\": 0.015594753632006518,\n \"mc2\": 0.46641168216975853,\n\
\ \"mc2_stderr\": 0.016269583261373614\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5943172849250198,\n \"acc_stderr\": 0.013800206336014203\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.0,\n \"acc_stderr\"\
: 0.0\n }\n}\n```"
repo_url: https://huggingface.co/frankenmerger/gemoy-4b-instruct
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|arc:challenge|25_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|gsm8k|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hellaswag|10_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T10-59-13.672299.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T10-59-13.672299.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- '**/details_harness|winogrande|5_2024-03-10T10-59-13.672299.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T10-59-13.672299.parquet'
- config_name: results
data_files:
- split: 2024_03_10T10_59_13.672299
path:
- results_2024-03-10T10-59-13.672299.parquet
- split: latest
path:
- results_2024-03-10T10-59-13.672299.parquet
---
# Dataset Card for Evaluation run of frankenmerger/gemoy-4b-instruct
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [frankenmerger/gemoy-4b-instruct](https://huggingface.co/frankenmerger/gemoy-4b-instruct) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct",
"harness_winogrande_5",
split="train")
```
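For a quick sanity check of the aggregated numbers, here is a minimal sketch that recomputes a simple mean over the headline metrics. The values are copied from the `"all"` section of the results JSON shown below; the plain mean is illustrative only and is not the official leaderboard average, which is computed per benchmark:

```python
# Headline metrics copied from the "all" section of this card's results JSON.
all_metrics = {
    "acc": 0.3635342339637508,
    "acc_norm": 0.36857377594697643,
    "mc2": 0.46641168216975853,
}

# Illustrative simple mean over the three headline metrics (not the
# official Open LLM Leaderboard average).
average = sum(all_metrics.values()) / len(all_metrics)
print(round(average, 4))  # -> 0.3995
```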
## Latest results
These are the [latest results from run 2024-03-10T10:59:13.672299](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct/blob/main/results_2024-03-10T10-59-13.672299.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find them in the "results" and "latest" splits of each eval):
```python
{
"all": {
"acc": 0.3635342339637508,
"acc_stderr": 0.03346560799526674,
"acc_norm": 0.36857377594697643,
"acc_norm_stderr": 0.03436928129673128,
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006518,
"mc2": 0.46641168216975853,
"mc2_stderr": 0.016269583261373614
},
"harness|arc:challenge|25": {
"acc": 0.3728668941979522,
"acc_stderr": 0.014131176760131167,
"acc_norm": 0.4069965870307167,
"acc_norm_stderr": 0.01435639941800913
},
"harness|hellaswag|10": {
"acc": 0.44981079466241786,
"acc_stderr": 0.004964579685712441,
"acc_norm": 0.5802628958374826,
"acc_norm_stderr": 0.004925072159723828
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.362962962962963,
"acc_stderr": 0.041539484047424,
"acc_norm": 0.362962962962963,
"acc_norm_stderr": 0.041539484047424
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.32894736842105265,
"acc_stderr": 0.038234289699266046,
"acc_norm": 0.32894736842105265,
"acc_norm_stderr": 0.038234289699266046
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.52,
"acc_stderr": 0.05021167315686781,
"acc_norm": 0.52,
"acc_norm_stderr": 0.05021167315686781
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.4075471698113208,
"acc_stderr": 0.030242233800854494,
"acc_norm": 0.4075471698113208,
"acc_norm_stderr": 0.030242233800854494
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.039420826399272135,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.039420826399272135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.24,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.24,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.034355680560478746,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.034355680560478746
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617747,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617747
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.46,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.46,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.34893617021276596,
"acc_stderr": 0.031158522131357783,
"acc_norm": 0.34893617021276596,
"acc_norm_stderr": 0.031158522131357783
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2631578947368421,
"acc_stderr": 0.04142439719489362,
"acc_norm": 0.2631578947368421,
"acc_norm_stderr": 0.04142439719489362
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.4413793103448276,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.4413793103448276,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2830687830687831,
"acc_stderr": 0.023201392938194974,
"acc_norm": 0.2830687830687831,
"acc_norm_stderr": 0.023201392938194974
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.25396825396825395,
"acc_stderr": 0.03893259610604674,
"acc_norm": 0.25396825396825395,
"acc_norm_stderr": 0.03893259610604674
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.35161290322580646,
"acc_stderr": 0.027162537826948458,
"acc_norm": 0.35161290322580646,
"acc_norm_stderr": 0.027162537826948458
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22660098522167488,
"acc_stderr": 0.029454863835292992,
"acc_norm": 0.22660098522167488,
"acc_norm_stderr": 0.029454863835292992
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.40606060606060607,
"acc_stderr": 0.03834816355401181,
"acc_norm": 0.40606060606060607,
"acc_norm_stderr": 0.03834816355401181
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.4797979797979798,
"acc_stderr": 0.035594435655639196,
"acc_norm": 0.4797979797979798,
"acc_norm_stderr": 0.035594435655639196
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47150259067357514,
"acc_stderr": 0.036025735712884414,
"acc_norm": 0.47150259067357514,
"acc_norm_stderr": 0.036025735712884414
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.3487179487179487,
"acc_stderr": 0.024162780284017717,
"acc_norm": 0.3487179487179487,
"acc_norm_stderr": 0.024162780284017717
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.21481481481481482,
"acc_stderr": 0.025040443877000683,
"acc_norm": 0.21481481481481482,
"acc_norm_stderr": 0.025040443877000683
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.31932773109243695,
"acc_stderr": 0.030283995525884396,
"acc_norm": 0.31932773109243695,
"acc_norm_stderr": 0.030283995525884396
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45504587155963305,
"acc_stderr": 0.021350503090925167,
"acc_norm": 0.45504587155963305,
"acc_norm_stderr": 0.021350503090925167
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.22685185185185186,
"acc_stderr": 0.02856165010242226,
"acc_norm": 0.22685185185185186,
"acc_norm_stderr": 0.02856165010242226
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.45569620253164556,
"acc_stderr": 0.03241920684693334,
"acc_norm": 0.45569620253164556,
"acc_norm_stderr": 0.03241920684693334
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.35874439461883406,
"acc_stderr": 0.03219079200419997,
"acc_norm": 0.35874439461883406,
"acc_norm_stderr": 0.03219079200419997
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.4122137404580153,
"acc_stderr": 0.04317171194870254,
"acc_norm": 0.4122137404580153,
"acc_norm_stderr": 0.04317171194870254
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5537190082644629,
"acc_stderr": 0.0453793517794788,
"acc_norm": 0.5537190082644629,
"acc_norm_stderr": 0.0453793517794788
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.04820403072760627,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.04820403072760627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.3312883435582822,
"acc_stderr": 0.03697983910025588,
"acc_norm": 0.3312883435582822,
"acc_norm_stderr": 0.03697983910025588
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285712,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285712
},
"harness|hendrycksTest-management|5": {
"acc": 0.4368932038834951,
"acc_stderr": 0.04911147107365777,
"acc_norm": 0.4368932038834951,
"acc_norm_stderr": 0.04911147107365777
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.5769230769230769,
"acc_stderr": 0.032366121762202014,
"acc_norm": 0.5769230769230769,
"acc_norm_stderr": 0.032366121762202014
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.421455938697318,
"acc_stderr": 0.017657976412654857,
"acc_norm": 0.421455938697318,
"acc_norm_stderr": 0.017657976412654857
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.44508670520231214,
"acc_stderr": 0.02675625512966377,
"acc_norm": 0.44508670520231214,
"acc_norm_stderr": 0.02675625512966377
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24916201117318434,
"acc_stderr": 0.01446589382985993,
"acc_norm": 0.24916201117318434,
"acc_norm_stderr": 0.01446589382985993
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.46405228758169936,
"acc_stderr": 0.02855582751652879,
"acc_norm": 0.46405228758169936,
"acc_norm_stderr": 0.02855582751652879
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3665594855305466,
"acc_stderr": 0.02736807824397163,
"acc_norm": 0.3665594855305466,
"acc_norm_stderr": 0.02736807824397163
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.02743162372241502,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.02743162372241502
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2907801418439716,
"acc_stderr": 0.027090664368353178,
"acc_norm": 0.2907801418439716,
"acc_norm_stderr": 0.027090664368353178
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3259452411994785,
"acc_stderr": 0.01197150729498278,
"acc_norm": 0.3259452411994785,
"acc_norm_stderr": 0.01197150729498278
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.25735294117647056,
"acc_stderr": 0.026556519470041513,
"acc_norm": 0.25735294117647056,
"acc_norm_stderr": 0.026556519470041513
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3545751633986928,
"acc_stderr": 0.019353360547553714,
"acc_norm": 0.3545751633986928,
"acc_norm_stderr": 0.019353360547553714
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.38181818181818183,
"acc_stderr": 0.04653429807913509,
"acc_norm": 0.38181818181818183,
"acc_norm_stderr": 0.04653429807913509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5020408163265306,
"acc_stderr": 0.0320089533497105,
"acc_norm": 0.5020408163265306,
"acc_norm_stderr": 0.0320089533497105
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.3781094527363184,
"acc_stderr": 0.03428867848778657,
"acc_norm": 0.3781094527363184,
"acc_norm_stderr": 0.03428867848778657
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.58,
"acc_stderr": 0.04960449637488583,
"acc_norm": 0.58,
"acc_norm_stderr": 0.04960449637488583
},
"harness|hendrycksTest-virology|5": {
"acc": 0.39156626506024095,
"acc_stderr": 0.03799857454479636,
"acc_norm": 0.39156626506024095,
"acc_norm_stderr": 0.03799857454479636
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.4152046783625731,
"acc_stderr": 0.03779275945503201,
"acc_norm": 0.4152046783625731,
"acc_norm_stderr": 0.03779275945503201
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2729498164014688,
"mc1_stderr": 0.015594753632006518,
"mc2": 0.46641168216975853,
"mc2_stderr": 0.016269583261373614
},
"harness|winogrande|5": {
"acc": 0.5943172849250198,
"acc_stderr": 0.013800206336014203
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
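The per-task scores above are plain JSON keyed by harness task name. As a minimal sketch (field names taken from the results block above; the excerpt below is a hand-copied subset, not the full file), the MMLU sub-task accuracies can be collected and averaged like this:

```python
import json

# A small excerpt mirroring the structure of the results JSON above.
results = json.loads("""
{
  "harness|hendrycksTest-marketing|5": {"acc": 0.5769230769230769, "acc_stderr": 0.032366121762202014},
  "harness|hendrycksTest-medical_genetics|5": {"acc": 0.28, "acc_stderr": 0.04512608598542127},
  "harness|winogrande|5": {"acc": 0.5943172849250198, "acc_stderr": 0.013800206336014203}
}
""")

# Collect the MMLU (hendrycksTest) sub-task accuracies and average them;
# other harness tasks such as winogrande are filtered out by the key prefix.
mmlu_accs = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(f"MMLU sub-tasks: {len(mmlu_accs)}, mean acc: {mmlu_mean:.4f}")
# → MMLU sub-tasks: 2, mean acc: 0.4285
```

The same key-prefix filtering works on the full results file downloaded from this repository.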
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
jjjaehee/customcoopang | jjjaehee | "2024-03-11T12:09:53Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:04:18Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 5826
num_examples: 39
download_size: 2572
dataset_size: 5826
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Hardik1234/reactjs-train | Hardik1234 | "2024-03-10T11:06:50Z" | 0 | 0 | [
"size_categories:100K<n<1M",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:06:26Z" | ---
dataset_info:
features:
- name: path
dtype: string
- name: repo_name
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 1646910413
num_examples: 410387
download_size: 621037694
dataset_size: 1646910413
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_mychen76__mistral-7b-merged-slerp | open-llm-leaderboard-old | "2024-03-10T11:07:30Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T11:07:10Z" | ---
pretty_name: Evaluation run of mychen76/mistral-7b-merged-slerp
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mychen76/mistral-7b-merged-slerp](https://huggingface.co/mychen76/mistral-7b-merged-slerp)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:04:57.263703](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp/blob/main/results_2024-03-10T11-04-57.263703.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6444688446653744,\n\
\ \"acc_stderr\": 0.03217564834975917,\n \"acc_norm\": 0.6448609553287138,\n\
\ \"acc_norm_stderr\": 0.032833467276313325,\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5985018412437423,\n\
\ \"mc2_stderr\": 0.01514980059720055\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6476109215017065,\n \"acc_stderr\": 0.013960142600598675,\n\
\ \"acc_norm\": 0.6774744027303754,\n \"acc_norm_stderr\": 0.013659980894277364\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6700856403106951,\n\
\ \"acc_stderr\": 0.004692208279690595,\n \"acc_norm\": 0.8616809400517825,\n\
\ \"acc_norm_stderr\": 0.0034452899250117337\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.7105263157894737,\n \"acc_stderr\": 0.03690677986137283,\n\
\ \"acc_norm\": 0.7105263157894737,\n \"acc_norm_stderr\": 0.03690677986137283\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6830188679245283,\n \"acc_stderr\": 0.02863723563980089,\n\
\ \"acc_norm\": 0.6830188679245283,\n \"acc_norm_stderr\": 0.02863723563980089\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.49,\n \"acc_stderr\": 0.05024183937956912,\n \
\ \"acc_norm\": 0.49,\n \"acc_norm_stderr\": 0.05024183937956912\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.29,\n\
\ \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\": 0.29,\n \
\ \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6473988439306358,\n \"acc_stderr\": 0.036430371689585475,\n\
\ \"acc_norm\": 0.6473988439306358,\n \"acc_norm_stderr\": 0.036430371689585475\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.39215686274509803,\n\
\ \"acc_stderr\": 0.04858083574266345,\n \"acc_norm\": 0.39215686274509803,\n\
\ \"acc_norm_stderr\": 0.04858083574266345\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.04408440022768078,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.04408440022768078\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5829787234042553,\n\
\ \"acc_stderr\": 0.03223276266711712,\n \"acc_norm\": 0.5829787234042553,\n\
\ \"acc_norm_stderr\": 0.03223276266711712\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.047036043419179864,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.047036043419179864\n \
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n \"\
acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41534391534391535,\n \"acc_stderr\": 0.025379524910778405,\n \"\
acc_norm\": 0.41534391534391535,\n \"acc_norm_stderr\": 0.025379524910778405\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.048783173121456316,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.048783173121456316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5073891625615764,\n \"acc_stderr\": 0.035176035403610105,\n \"\
acc_norm\": 0.5073891625615764,\n \"acc_norm_stderr\": 0.035176035403610105\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.0328766675860349,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.0328766675860349\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.029126522834586818,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.029126522834586818\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.02150024957603346,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.02150024957603346\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6538461538461539,\n \"acc_stderr\": 0.02412112541694119,\n \
\ \"acc_norm\": 0.6538461538461539,\n \"acc_norm_stderr\": 0.02412112541694119\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658752,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658752\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8550458715596331,\n \"acc_stderr\": 0.01509421569970048,\n \"\
acc_norm\": 0.8550458715596331,\n \"acc_norm_stderr\": 0.01509421569970048\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5185185185185185,\n \"acc_stderr\": 0.034076320938540516,\n \"\
acc_norm\": 0.5185185185185185,\n \"acc_norm_stderr\": 0.034076320938540516\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474082,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474082\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.0257449025322909,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.0257449025322909\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6905829596412556,\n\
\ \"acc_stderr\": 0.03102441174057221,\n \"acc_norm\": 0.6905829596412556,\n\
\ \"acc_norm_stderr\": 0.03102441174057221\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.8099173553719008,\n \"acc_stderr\": 0.03581796951709282,\n \"\
acc_norm\": 0.8099173553719008,\n \"acc_norm_stderr\": 0.03581796951709282\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7572815533980582,\n \"acc_stderr\": 0.04245022486384495,\n\
\ \"acc_norm\": 0.7572815533980582,\n \"acc_norm_stderr\": 0.04245022486384495\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323793,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323793\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7283236994219653,\n \"acc_stderr\": 0.023948512905468365,\n\
\ \"acc_norm\": 0.7283236994219653,\n \"acc_norm_stderr\": 0.023948512905468365\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.36312849162011174,\n\
\ \"acc_stderr\": 0.016083749986853697,\n \"acc_norm\": 0.36312849162011174,\n\
\ \"acc_norm_stderr\": 0.016083749986853697\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7450980392156863,\n \"acc_stderr\": 0.02495418432487991,\n\
\ \"acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.02495418432487991\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7106109324758842,\n\
\ \"acc_stderr\": 0.025755865922632945,\n \"acc_norm\": 0.7106109324758842,\n\
\ \"acc_norm_stderr\": 0.025755865922632945\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7469135802469136,\n \"acc_stderr\": 0.024191808600712995,\n\
\ \"acc_norm\": 0.7469135802469136,\n \"acc_norm_stderr\": 0.024191808600712995\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4726205997392438,\n\
\ \"acc_stderr\": 0.012751075788015058,\n \"acc_norm\": 0.4726205997392438,\n\
\ \"acc_norm_stderr\": 0.012751075788015058\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6948529411764706,\n \"acc_stderr\": 0.027971541370170598,\n\
\ \"acc_norm\": 0.6948529411764706,\n \"acc_norm_stderr\": 0.027971541370170598\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6748366013071896,\n \"acc_stderr\": 0.01895088677080631,\n \
\ \"acc_norm\": 0.6748366013071896,\n \"acc_norm_stderr\": 0.01895088677080631\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302506,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302506\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.746938775510204,\n \"acc_stderr\": 0.027833023871399673,\n\
\ \"acc_norm\": 0.746938775510204,\n \"acc_norm_stderr\": 0.027833023871399673\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.835820895522388,\n\
\ \"acc_stderr\": 0.026193923544454115,\n \"acc_norm\": 0.835820895522388,\n\
\ \"acc_norm_stderr\": 0.026193923544454115\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4283965728274174,\n\
\ \"mc1_stderr\": 0.017323088597314754,\n \"mc2\": 0.5985018412437423,\n\
\ \"mc2_stderr\": 0.01514980059720055\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8018942383583267,\n \"acc_stderr\": 0.01120186274448705\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6853677028051555,\n \
\ \"acc_stderr\": 0.01279103722733604\n }\n}\n```"
repo_url: https://huggingface.co/mychen76/mistral-7b-merged-slerp
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-04-57.263703.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-04-57.263703.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- '**/details_harness|winogrande|5_2024-03-10T11-04-57.263703.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-04-57.263703.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_04_57.263703
path:
- results_2024-03-10T11-04-57.263703.parquet
- split: latest
path:
- results_2024-03-10T11-04-57.263703.parquet
---
# Dataset Card for Evaluation run of mychen76/mistral-7b-merged-slerp
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mychen76/mistral-7b-merged-slerp](https://huggingface.co/mychen76/mistral-7b-merged-slerp) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp",
"harness_winogrande_5",
split="train")
```
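Since the timestamped split names follow a sortable `YYYY_MM_DDTHH_MM_SS.ffffff` pattern, lexicographic order matches chronological order. As a small illustrative sketch (not part of the official dataset API), the most recent run can be selected from a list of split names like so:

```python
# Pick the most recent timestamped split from a config's split names.
# The "latest" alias is excluded; the remaining names follow the
# YYYY_MM_DDTHH_MM_SS.ffffff pattern, so lexicographic max() is the
# chronologically newest run.
def most_recent_split(split_names):
    timestamped = [s for s in split_names if s != "latest"]
    return max(timestamped)

splits = ["2024_03_09T10_00_00.000000", "2024_03_10T11_04_57.263703", "latest"]
print(most_recent_split(splits))  # 2024_03_10T11_04_57.263703
```

In practice the `latest` split already aliases this newest run, so this is only useful when comparing several historical runs.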
## Latest results
These are the [latest results from run 2024-03-10T11:04:57.263703](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-slerp/blob/main/results_2024-03-10T11-04-57.263703.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6444688446653744,
"acc_stderr": 0.03217564834975917,
"acc_norm": 0.6448609553287138,
"acc_norm_stderr": 0.032833467276313325,
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5985018412437423,
"mc2_stderr": 0.01514980059720055
},
"harness|arc:challenge|25": {
"acc": 0.6476109215017065,
"acc_stderr": 0.013960142600598675,
"acc_norm": 0.6774744027303754,
"acc_norm_stderr": 0.013659980894277364
},
"harness|hellaswag|10": {
"acc": 0.6700856403106951,
"acc_stderr": 0.004692208279690595,
"acc_norm": 0.8616809400517825,
"acc_norm_stderr": 0.0034452899250117337
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.7105263157894737,
"acc_stderr": 0.03690677986137283,
"acc_norm": 0.7105263157894737,
"acc_norm_stderr": 0.03690677986137283
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6830188679245283,
"acc_stderr": 0.02863723563980089,
"acc_norm": 0.6830188679245283,
"acc_norm_stderr": 0.02863723563980089
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.036430371689585475,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.036430371689585475
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.04858083574266345,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.04858083574266345
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5829787234042553,
"acc_stderr": 0.03223276266711712,
"acc_norm": 0.5829787234042553,
"acc_norm_stderr": 0.03223276266711712
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5,
"acc_stderr": 0.047036043419179864,
"acc_norm": 0.5,
"acc_norm_stderr": 0.047036043419179864
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41534391534391535,
"acc_stderr": 0.025379524910778405,
"acc_norm": 0.41534391534391535,
"acc_norm_stderr": 0.025379524910778405
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.38,
"acc_stderr": 0.048783173121456316,
"acc_norm": 0.38,
"acc_norm_stderr": 0.048783173121456316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5073891625615764,
"acc_stderr": 0.035176035403610105,
"acc_norm": 0.5073891625615764,
"acc_norm_stderr": 0.035176035403610105
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.0328766675860349,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.0328766675860349
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.029126522834586818,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.029126522834586818
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.02150024957603346,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.02150024957603346
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6538461538461539,
"acc_stderr": 0.02412112541694119,
"acc_norm": 0.6538461538461539,
"acc_norm_stderr": 0.02412112541694119
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658752,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658752
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8550458715596331,
"acc_stderr": 0.01509421569970048,
"acc_norm": 0.8550458715596331,
"acc_norm_stderr": 0.01509421569970048
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.034076320938540516,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.034076320938540516
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474082,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474082
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.0257449025322909,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.0257449025322909
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6905829596412556,
"acc_stderr": 0.03102441174057221,
"acc_norm": 0.6905829596412556,
"acc_norm_stderr": 0.03102441174057221
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.8099173553719008,
"acc_stderr": 0.03581796951709282,
"acc_norm": 0.8099173553719008,
"acc_norm_stderr": 0.03581796951709282
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7572815533980582,
"acc_stderr": 0.04245022486384495,
"acc_norm": 0.7572815533980582,
"acc_norm_stderr": 0.04245022486384495
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323793,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323793
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7283236994219653,
"acc_stderr": 0.023948512905468365,
"acc_norm": 0.7283236994219653,
"acc_norm_stderr": 0.023948512905468365
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.36312849162011174,
"acc_stderr": 0.016083749986853697,
"acc_norm": 0.36312849162011174,
"acc_norm_stderr": 0.016083749986853697
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.02495418432487991,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.02495418432487991
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7106109324758842,
"acc_stderr": 0.025755865922632945,
"acc_norm": 0.7106109324758842,
"acc_norm_stderr": 0.025755865922632945
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7469135802469136,
"acc_stderr": 0.024191808600712995,
"acc_norm": 0.7469135802469136,
"acc_norm_stderr": 0.024191808600712995
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4726205997392438,
"acc_stderr": 0.012751075788015058,
"acc_norm": 0.4726205997392438,
"acc_norm_stderr": 0.012751075788015058
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6948529411764706,
"acc_stderr": 0.027971541370170598,
"acc_norm": 0.6948529411764706,
"acc_norm_stderr": 0.027971541370170598
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6748366013071896,
"acc_stderr": 0.01895088677080631,
"acc_norm": 0.6748366013071896,
"acc_norm_stderr": 0.01895088677080631
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302506,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302506
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.746938775510204,
"acc_stderr": 0.027833023871399673,
"acc_norm": 0.746938775510204,
"acc_norm_stderr": 0.027833023871399673
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.835820895522388,
"acc_stderr": 0.026193923544454115,
"acc_norm": 0.835820895522388,
"acc_norm_stderr": 0.026193923544454115
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4283965728274174,
"mc1_stderr": 0.017323088597314754,
"mc2": 0.5985018412437423,
"mc2_stderr": 0.01514980059720055
},
"harness|winogrande|5": {
"acc": 0.8018942383583267,
"acc_stderr": 0.01120186274448705
},
"harness|gsm8k|5": {
"acc": 0.6853677028051555,
"acc_stderr": 0.01279103722733604
}
}
```
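For a quick sanity check, the per-task scores above can be aggregated locally without downloading the full dataset. The sketch below macro-averages a small, hand-copied subset of the MMLU task accuracies from the results block; it is an illustrative aggregation only, not the harness's own scoring code, and the selection of tasks is arbitrary:

```python
# Macro-average a few of the per-task accuracies shown above.
# Values are copied verbatim from the results block; averaging a
# subset like this is a sketch, not the leaderboard's exact method.
results = {
    "harness|hendrycksTest-human_aging|5": 0.6905829596412556,
    "harness|hendrycksTest-human_sexuality|5": 0.7786259541984732,
    "harness|hendrycksTest-international_law|5": 0.8099173553719008,
    "harness|hendrycksTest-jurisprudence|5": 0.7685185185185185,
}

macro_avg = sum(results.values()) / len(results)
print(f"macro-average acc over {len(results)} tasks: {macro_avg:.4f}")
```

The full per-example predictions behind these numbers live in the parquet splits listed in this card's `configs` section and can be pulled with `datasets.load_dataset`.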
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Zen1t/projects-dataset | Zen1t | "2024-03-10T11:07:24Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-03-10T11:07:24Z" | ---
license: apache-2.0
---
|
open-llm-leaderboard-old/details_mychen76__mistral-7b-merged-ties | open-llm-leaderboard-old | "2024-03-10T11:07:57Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T11:07:37Z" | ---
pretty_name: Evaluation run of mychen76/mistral-7b-merged-ties
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mychen76/mistral-7b-merged-ties](https://huggingface.co/mychen76/mistral-7b-merged-ties)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mychen76__mistral-7b-merged-ties\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:05:18.535141](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-ties/blob/main/results_2024-03-10T11-05-18.535141.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6445924072176131,\n\
\ \"acc_stderr\": 0.03213293328697562,\n \"acc_norm\": 0.6450342620069291,\n\
\ \"acc_norm_stderr\": 0.032788565108750604,\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140357,\n \"mc2\": 0.6131109579182783,\n\
\ \"mc2_stderr\": 0.015351738756398125\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6390784982935154,\n \"acc_stderr\": 0.014034761386175452,\n\
\ \"acc_norm\": 0.6791808873720137,\n \"acc_norm_stderr\": 0.013640943091946531\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6722764389563832,\n\
\ \"acc_stderr\": 0.004684241685200317,\n \"acc_norm\": 0.85929097789285,\n\
\ \"acc_norm_stderr\": 0.00347010499020439\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.04512608598542128,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.04512608598542128\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6074074074074074,\n\
\ \"acc_stderr\": 0.0421850621536888,\n \"acc_norm\": 0.6074074074074074,\n\
\ \"acc_norm_stderr\": 0.0421850621536888\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.743421052631579,\n \"acc_stderr\": 0.0355418036802569,\n\
\ \"acc_norm\": 0.743421052631579,\n \"acc_norm_stderr\": 0.0355418036802569\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001975,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001975\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7777777777777778,\n\
\ \"acc_stderr\": 0.03476590104304134,\n \"acc_norm\": 0.7777777777777778,\n\
\ \"acc_norm_stderr\": 0.03476590104304134\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\"\
: 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.630057803468208,\n\
\ \"acc_stderr\": 0.036812296333943194,\n \"acc_norm\": 0.630057803468208,\n\
\ \"acc_norm_stderr\": 0.036812296333943194\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.042923469599092816,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.042923469599092816\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.574468085106383,\n \"acc_stderr\": 0.03232146916224468,\n\
\ \"acc_norm\": 0.574468085106383,\n \"acc_norm_stderr\": 0.03232146916224468\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5175438596491229,\n\
\ \"acc_stderr\": 0.04700708033551038,\n \"acc_norm\": 0.5175438596491229,\n\
\ \"acc_norm_stderr\": 0.04700708033551038\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4126984126984127,\n \"acc_stderr\": 0.025355741263055263,\n \"\
acc_norm\": 0.4126984126984127,\n \"acc_norm_stderr\": 0.025355741263055263\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4365079365079365,\n\
\ \"acc_stderr\": 0.04435932892851466,\n \"acc_norm\": 0.4365079365079365,\n\
\ \"acc_norm_stderr\": 0.04435932892851466\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7645161290322581,\n \"acc_stderr\": 0.02413763242933771,\n \"\
acc_norm\": 0.7645161290322581,\n \"acc_norm_stderr\": 0.02413763242933771\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.67,\n \"acc_stderr\": 0.04725815626252607,\n \"acc_norm\"\
: 0.67,\n \"acc_norm_stderr\": 0.04725815626252607\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8963730569948186,\n \"acc_stderr\": 0.02199531196364424,\n\
\ \"acc_norm\": 0.8963730569948186,\n \"acc_norm_stderr\": 0.02199531196364424\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6410256410256411,\n \"acc_stderr\": 0.024321738484602354,\n\
\ \"acc_norm\": 0.6410256410256411,\n \"acc_norm_stderr\": 0.024321738484602354\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616255,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616255\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6890756302521008,\n \"acc_stderr\": 0.03006676158297793,\n \
\ \"acc_norm\": 0.6890756302521008,\n \"acc_norm_stderr\": 0.03006676158297793\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2980132450331126,\n \"acc_stderr\": 0.037345356767871984,\n \"\
acc_norm\": 0.2980132450331126,\n \"acc_norm_stderr\": 0.037345356767871984\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8186274509803921,\n \"acc_stderr\": 0.027044621719474086,\n \"\
acc_norm\": 0.8186274509803921,\n \"acc_norm_stderr\": 0.027044621719474086\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8185654008438819,\n \"acc_stderr\": 0.025085961144579665,\n \
\ \"acc_norm\": 0.8185654008438819,\n \"acc_norm_stderr\": 0.025085961144579665\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7862595419847328,\n \"acc_stderr\": 0.0359546161177469,\n\
\ \"acc_norm\": 0.7862595419847328,\n \"acc_norm_stderr\": 0.0359546161177469\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7607361963190185,\n \"acc_stderr\": 0.033519538795212696,\n\
\ \"acc_norm\": 0.7607361963190185,\n \"acc_norm_stderr\": 0.033519538795212696\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4642857142857143,\n\
\ \"acc_stderr\": 0.04733667890053756,\n \"acc_norm\": 0.4642857142857143,\n\
\ \"acc_norm_stderr\": 0.04733667890053756\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7766990291262136,\n \"acc_stderr\": 0.04123553189891431,\n\
\ \"acc_norm\": 0.7766990291262136,\n \"acc_norm_stderr\": 0.04123553189891431\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8301404853128991,\n\
\ \"acc_stderr\": 0.013428186370608304,\n \"acc_norm\": 0.8301404853128991,\n\
\ \"acc_norm_stderr\": 0.013428186370608304\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7369942196531792,\n \"acc_stderr\": 0.023703099525258172,\n\
\ \"acc_norm\": 0.7369942196531792,\n \"acc_norm_stderr\": 0.023703099525258172\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3664804469273743,\n\
\ \"acc_stderr\": 0.016115235504865467,\n \"acc_norm\": 0.3664804469273743,\n\
\ \"acc_norm_stderr\": 0.016115235504865467\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242553,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242553\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7170418006430869,\n\
\ \"acc_stderr\": 0.02558306248998481,\n \"acc_norm\": 0.7170418006430869,\n\
\ \"acc_norm_stderr\": 0.02558306248998481\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7376543209876543,\n \"acc_stderr\": 0.024477222856135114,\n\
\ \"acc_norm\": 0.7376543209876543,\n \"acc_norm_stderr\": 0.024477222856135114\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5070921985815603,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.5070921985815603,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4667535853976532,\n\
\ \"acc_stderr\": 0.01274197433389723,\n \"acc_norm\": 0.4667535853976532,\n\
\ \"acc_norm_stderr\": 0.01274197433389723\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6764705882352942,\n \"acc_stderr\": 0.02841820861940676,\n\
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.02841820861940676\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6666666666666666,\n \"acc_stderr\": 0.019070985589687495,\n \
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.019070985589687495\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578337,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578337\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4455324357405141,\n\
\ \"mc1_stderr\": 0.017399335280140357,\n \"mc2\": 0.6131109579182783,\n\
\ \"mc2_stderr\": 0.015351738756398125\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8003157063930545,\n \"acc_stderr\": 0.011235328382625842\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6899166034874905,\n \
\ \"acc_stderr\": 0.01274030571737627\n }\n}\n```"
repo_url: https://huggingface.co/mychen76/mistral-7b-merged-ties
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-05-18.535141.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-05-18.535141.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- '**/details_harness|winogrande|5_2024-03-10T11-05-18.535141.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-05-18.535141.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_05_18.535141
path:
- results_2024-03-10T11-05-18.535141.parquet
- split: latest
path:
- results_2024-03-10T11-05-18.535141.parquet
---
# Dataset Card for Evaluation run of mychen76/mistral-7b-merged-ties
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [mychen76/mistral-7b-merged-ties](https://huggingface.co/mychen76/mistral-7b-merged-ties) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "latest" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_mychen76__mistral-7b-merged-ties",
"harness_winogrande_5",
	split="latest")
```
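The per-task config names above follow a regular pattern, so you can build them programmatically instead of copying them by hand. The helper below is a small illustrative sketch (the function name is ours, not part of any library); it mirrors the `harness_hendrycksTest_<subject>_<n_shot>` naming visible in the config list.

```python
def mmlu_config_name(subject: str, n_shot: int = 5) -> str:
    """Build the config name for an MMLU (hendrycksTest) subtask,
    following the naming scheme used by this dataset's configs."""
    return f"harness_hendrycksTest_{subject}_{n_shot}"

# Example: pass the result straight to load_dataset, e.g.
#   load_dataset(repo_id, mmlu_config_name("anatomy"), split="latest")
print(mmlu_config_name("anatomy"))  # harness_hendrycksTest_anatomy_5
```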
## Latest results
These are the [latest results from run 2024-03-10T11:05:18.535141](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-ties/blob/main/results_2024-03-10T11-05-18.535141.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each one in the results and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.6445924072176131,
"acc_stderr": 0.03213293328697562,
"acc_norm": 0.6450342620069291,
"acc_norm_stderr": 0.032788565108750604,
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140357,
"mc2": 0.6131109579182783,
"mc2_stderr": 0.015351738756398125
},
"harness|arc:challenge|25": {
"acc": 0.6390784982935154,
"acc_stderr": 0.014034761386175452,
"acc_norm": 0.6791808873720137,
"acc_norm_stderr": 0.013640943091946531
},
"harness|hellaswag|10": {
"acc": 0.6722764389563832,
"acc_stderr": 0.004684241685200317,
"acc_norm": 0.85929097789285,
"acc_norm_stderr": 0.00347010499020439
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542128,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542128
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6074074074074074,
"acc_stderr": 0.0421850621536888,
"acc_norm": 0.6074074074074074,
"acc_norm_stderr": 0.0421850621536888
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.743421052631579,
"acc_stderr": 0.0355418036802569,
"acc_norm": 0.743421052631579,
"acc_norm_stderr": 0.0355418036802569
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.03476590104304134,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.03476590104304134
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.630057803468208,
"acc_stderr": 0.036812296333943194,
"acc_norm": 0.630057803468208,
"acc_norm_stderr": 0.036812296333943194
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.042923469599092816,
"acc_norm": 0.76,
"acc_norm_stderr": 0.042923469599092816
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.574468085106383,
"acc_stderr": 0.03232146916224468,
"acc_norm": 0.574468085106383,
"acc_norm_stderr": 0.03232146916224468
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5175438596491229,
"acc_stderr": 0.04700708033551038,
"acc_norm": 0.5175438596491229,
"acc_norm_stderr": 0.04700708033551038
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4126984126984127,
"acc_stderr": 0.025355741263055263,
"acc_norm": 0.4126984126984127,
"acc_norm_stderr": 0.025355741263055263
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4365079365079365,
"acc_stderr": 0.04435932892851466,
"acc_norm": 0.4365079365079365,
"acc_norm_stderr": 0.04435932892851466
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7645161290322581,
"acc_stderr": 0.02413763242933771,
"acc_norm": 0.7645161290322581,
"acc_norm_stderr": 0.02413763242933771
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.67,
"acc_stderr": 0.04725815626252607,
"acc_norm": 0.67,
"acc_norm_stderr": 0.04725815626252607
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8963730569948186,
"acc_stderr": 0.02199531196364424,
"acc_norm": 0.8963730569948186,
"acc_norm_stderr": 0.02199531196364424
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6410256410256411,
"acc_stderr": 0.024321738484602354,
"acc_norm": 0.6410256410256411,
"acc_norm_stderr": 0.024321738484602354
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616255,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616255
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6890756302521008,
"acc_stderr": 0.03006676158297793,
"acc_norm": 0.6890756302521008,
"acc_norm_stderr": 0.03006676158297793
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2980132450331126,
"acc_stderr": 0.037345356767871984,
"acc_norm": 0.2980132450331126,
"acc_norm_stderr": 0.037345356767871984
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8186274509803921,
"acc_stderr": 0.027044621719474086,
"acc_norm": 0.8186274509803921,
"acc_norm_stderr": 0.027044621719474086
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8185654008438819,
"acc_stderr": 0.025085961144579665,
"acc_norm": 0.8185654008438819,
"acc_norm_stderr": 0.025085961144579665
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7862595419847328,
"acc_stderr": 0.0359546161177469,
"acc_norm": 0.7862595419847328,
"acc_norm_stderr": 0.0359546161177469
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7607361963190185,
"acc_stderr": 0.033519538795212696,
"acc_norm": 0.7607361963190185,
"acc_norm_stderr": 0.033519538795212696
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4642857142857143,
"acc_stderr": 0.04733667890053756,
"acc_norm": 0.4642857142857143,
"acc_norm_stderr": 0.04733667890053756
},
"harness|hendrycksTest-management|5": {
"acc": 0.7766990291262136,
"acc_stderr": 0.04123553189891431,
"acc_norm": 0.7766990291262136,
"acc_norm_stderr": 0.04123553189891431
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8301404853128991,
"acc_stderr": 0.013428186370608304,
"acc_norm": 0.8301404853128991,
"acc_norm_stderr": 0.013428186370608304
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7369942196531792,
"acc_stderr": 0.023703099525258172,
"acc_norm": 0.7369942196531792,
"acc_norm_stderr": 0.023703099525258172
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3664804469273743,
"acc_stderr": 0.016115235504865467,
"acc_norm": 0.3664804469273743,
"acc_norm_stderr": 0.016115235504865467
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242553,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242553
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7170418006430869,
"acc_stderr": 0.02558306248998481,
"acc_norm": 0.7170418006430869,
"acc_norm_stderr": 0.02558306248998481
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7376543209876543,
"acc_stderr": 0.024477222856135114,
"acc_norm": 0.7376543209876543,
"acc_norm_stderr": 0.024477222856135114
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5070921985815603,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.5070921985815603,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4667535853976532,
"acc_stderr": 0.01274197433389723,
"acc_norm": 0.4667535853976532,
"acc_norm_stderr": 0.01274197433389723
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.02841820861940676,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.02841820861940676
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.019070985589687495,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.019070985589687495
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578337,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578337
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4455324357405141,
"mc1_stderr": 0.017399335280140357,
"mc2": 0.6131109579182783,
"mc2_stderr": 0.015351738756398125
},
"harness|winogrande|5": {
"acc": 0.8003157063930545,
"acc_stderr": 0.011235328382625842
},
"harness|gsm8k|5": {
"acc": 0.6899166034874905,
"acc_stderr": 0.01274030571737627
}
}
```
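Once loaded, the per-task metrics above can be post-processed locally. As a small illustrative sketch, the dict literal below abridges the JSON above to four entries (values copied verbatim from the results) and ranks the accuracy-reporting tasks best-first; note that `harness|truthfulqa:mc|0` reports `mc1`/`mc2` instead of `acc` and is skipped:

```python
# Abridged copy of the aggregated results shown above (four tasks only).
results = {
    "harness|winogrande|5": {"acc": 0.8003157063930545,
                             "acc_stderr": 0.011235328382625842},
    "harness|gsm8k|5": {"acc": 0.6899166034874905,
                        "acc_stderr": 0.01274030571737627},
    "harness|truthfulqa:mc|0": {"mc1": 0.4455324357405141,
                                "mc2": 0.6131109579182783},
    "harness|hendrycksTest-virology|5": {"acc": 0.5481927710843374,
                                         "acc_stderr": 0.03874371556587953},
}

def rank_by_accuracy(results):
    """Return (task, acc) pairs sorted best-first, skipping tasks that
    report other metrics (e.g. truthfulqa's mc1/mc2)."""
    scored = [(task, metrics["acc"])
              for task, metrics in results.items() if "acc" in metrics]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

ranking = rank_by_accuracy(results)
print(ranking[0])  # highest-accuracy task in this abridged sample
```

The same pattern applies to the full `"all"`-level dict if you parse the latest results JSON from the repo.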
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed]

<!-- datasetId: open-llm-leaderboard-old/details_mychen76__mistral-7b-merged-dare | author: open-llm-leaderboard-old | last_modified: "2024-03-10T11:09:00Z" | downloads: 0 | likes: 0 | tags: ["region:us"] | task_categories: null | createdAt: "2024-03-10T11:08:40Z" -->
---
pretty_name: Evaluation run of mychen76/mistral-7b-merged-dare
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [mychen76/mistral-7b-merged-dare](https://huggingface.co/mychen76/mistral-7b-merged-dare)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_mychen76__mistral-7b-merged-dare\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:06:23.658904](https://huggingface.co/datasets/open-llm-leaderboard/details_mychen76__mistral-7b-merged-dare/blob/main/results_2024-03-10T11-06-23.658904.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6555984630611819,\n\
\ \"acc_stderr\": 0.03202413494937558,\n \"acc_norm\": 0.6552172318444804,\n\
\ \"acc_norm_stderr\": 0.03269035140382117,\n \"mc1\": 0.46878824969400246,\n\
\ \"mc1_stderr\": 0.017469364874577537,\n \"mc2\": 0.6324196158065736,\n\
\ \"mc2_stderr\": 0.015183642172146008\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6655290102389079,\n \"acc_stderr\": 0.013787460322441374,\n\
\ \"acc_norm\": 0.697098976109215,\n \"acc_norm_stderr\": 0.013428241573185349\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6896036646086438,\n\
\ \"acc_stderr\": 0.004617103280372031,\n \"acc_norm\": 0.8705437163911571,\n\
\ \"acc_norm_stderr\": 0.003350181812941611\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6842105263157895,\n \"acc_stderr\": 0.0378272898086547,\n\
\ \"acc_norm\": 0.6842105263157895,\n \"acc_norm_stderr\": 0.0378272898086547\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7245283018867924,\n \"acc_stderr\": 0.027495663683724057,\n\
\ \"acc_norm\": 0.7245283018867924,\n \"acc_norm_stderr\": 0.027495663683724057\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4411764705882353,\n \"acc_stderr\": 0.049406356306056595,\n\
\ \"acc_norm\": 0.4411764705882353,\n \"acc_norm_stderr\": 0.049406356306056595\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.76,\n \"acc_stderr\": 0.04292346959909282,\n \"acc_norm\": 0.76,\n\
\ \"acc_norm_stderr\": 0.04292346959909282\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.6,\n \"acc_stderr\": 0.03202563076101735,\n \
\ \"acc_norm\": 0.6,\n \"acc_norm_stderr\": 0.03202563076101735\n },\n\
\ \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5655172413793104,\n \"acc_stderr\": 0.04130740879555498,\n\
\ \"acc_norm\": 0.5655172413793104,\n \"acc_norm_stderr\": 0.04130740879555498\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42063492063492064,\n \"acc_stderr\": 0.025424835086924,\n \"acc_norm\"\
: 0.42063492063492064,\n \"acc_norm_stderr\": 0.025424835086924\n },\n\
\ \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.47619047619047616,\n\
\ \"acc_stderr\": 0.04467062628403273,\n \"acc_norm\": 0.47619047619047616,\n\
\ \"acc_norm_stderr\": 0.04467062628403273\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.048241815132442176,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.048241815132442176\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7774193548387097,\n \"acc_stderr\": 0.023664216671642518,\n \"\
acc_norm\": 0.7774193548387097,\n \"acc_norm_stderr\": 0.023664216671642518\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.72,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\"\
: 0.72,\n \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7757575757575758,\n \"acc_stderr\": 0.03256866661681102,\n\
\ \"acc_norm\": 0.7757575757575758,\n \"acc_norm_stderr\": 0.03256866661681102\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7878787878787878,\n \"acc_stderr\": 0.02912652283458682,\n \"\
acc_norm\": 0.7878787878787878,\n \"acc_norm_stderr\": 0.02912652283458682\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.020986854593289733,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.020986854593289733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.32222222222222224,\n \"acc_stderr\": 0.028493465091028593,\n \
\ \"acc_norm\": 0.32222222222222224,\n \"acc_norm_stderr\": 0.028493465091028593\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6848739495798319,\n \"acc_stderr\": 0.030176808288974337,\n\
\ \"acc_norm\": 0.6848739495798319,\n \"acc_norm_stderr\": 0.030176808288974337\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3576158940397351,\n \"acc_stderr\": 0.03913453431177258,\n \"\
acc_norm\": 0.3576158940397351,\n \"acc_norm_stderr\": 0.03913453431177258\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5324074074074074,\n \"acc_stderr\": 0.03402801581358966,\n \"\
acc_norm\": 0.5324074074074074,\n \"acc_norm_stderr\": 0.03402801581358966\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8529411764705882,\n \"acc_stderr\": 0.024857478080250454,\n \"\
acc_norm\": 0.8529411764705882,\n \"acc_norm_stderr\": 0.024857478080250454\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.031146796482972465,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.031146796482972465\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.8091603053435115,\n \"acc_stderr\": 0.03446513350752599,\n\
\ \"acc_norm\": 0.8091603053435115,\n \"acc_norm_stderr\": 0.03446513350752599\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.039578354719809805,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.039578354719809805\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.42857142857142855,\n\
\ \"acc_stderr\": 0.04697113923010212,\n \"acc_norm\": 0.42857142857142855,\n\
\ \"acc_norm_stderr\": 0.04697113923010212\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.01326534626132379,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.01326534626132379\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.02344582627654554,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.02344582627654554\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.42569832402234636,\n\
\ \"acc_stderr\": 0.016536829648997112,\n \"acc_norm\": 0.42569832402234636,\n\
\ \"acc_norm_stderr\": 0.016536829648997112\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.738562091503268,\n \"acc_stderr\": 0.025160998214292452,\n\
\ \"acc_norm\": 0.738562091503268,\n \"acc_norm_stderr\": 0.025160998214292452\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.02567025924218893,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.02567025924218893\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.0242885336377261,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.0242885336377261\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46740547588005216,\n\
\ \"acc_stderr\": 0.012743072942653342,\n \"acc_norm\": 0.46740547588005216,\n\
\ \"acc_norm_stderr\": 0.012743072942653342\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6764705882352942,\n \"acc_stderr\": 0.018926082916083383,\n \
\ \"acc_norm\": 0.6764705882352942,\n \"acc_norm_stderr\": 0.018926082916083383\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6818181818181818,\n\
\ \"acc_stderr\": 0.044612721759105085,\n \"acc_norm\": 0.6818181818181818,\n\
\ \"acc_norm_stderr\": 0.044612721759105085\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.726530612244898,\n \"acc_stderr\": 0.02853556033712844,\n\
\ \"acc_norm\": 0.726530612244898,\n \"acc_norm_stderr\": 0.02853556033712844\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8407960199004975,\n\
\ \"acc_stderr\": 0.02587064676616913,\n \"acc_norm\": 0.8407960199004975,\n\
\ \"acc_norm_stderr\": 0.02587064676616913\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.86,\n \"acc_stderr\": 0.0348735088019777,\n \
\ \"acc_norm\": 0.86,\n \"acc_norm_stderr\": 0.0348735088019777\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5421686746987951,\n\
\ \"acc_stderr\": 0.0387862677100236,\n \"acc_norm\": 0.5421686746987951,\n\
\ \"acc_norm_stderr\": 0.0387862677100236\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8421052631578947,\n \"acc_stderr\": 0.027966785859160893,\n\
\ \"acc_norm\": 0.8421052631578947,\n \"acc_norm_stderr\": 0.027966785859160893\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.46878824969400246,\n\
\ \"mc1_stderr\": 0.017469364874577537,\n \"mc2\": 0.6324196158065736,\n\
\ \"mc2_stderr\": 0.015183642172146008\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8161010260457774,\n \"acc_stderr\": 0.010887916013305889\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.730098559514784,\n \
\ \"acc_stderr\": 0.012227442856468897\n }\n}\n```"
repo_url: https://huggingface.co/mychen76/mistral-7b-merged-dare
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-06-23.658904.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-06-23.658904.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- '**/details_harness|winogrande|5_2024-03-10T11-06-23.658904.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-06-23.658904.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_06_23.658904
path:
- results_2024-03-10T11-06-23.658904.parquet
- split: latest
path:
- results_2024-03-10T11-06-23.658904.parquet
---
# Dataset Card for Evaluation run of saltlux/luxia-21.4b-alignment-v1.0
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [saltlux/luxia-21.4b-alignment-v1.0](https://huggingface.co/saltlux/luxia-21.4b-alignment-v1.0) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.0",
"harness_winogrande_5",
split="train")
```
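Each configuration also exposes one timestamped split per run plus a `latest` alias. If you are working with several runs, you can pick the most recent timestamped split yourself; a minimal sketch, assuming the split-name format shown in this card's configs:

```python
from datetime import datetime

# Timestamped split names follow the pattern used in this card's configs,
# e.g. "2024_03_10T11_06_23.658904". In practice you would read these from
# the dataset builder's split info.
splits = ["2024_03_10T11_06_23.658904"]

def parse_split(name: str) -> datetime:
    """Parse a run-timestamp split name into a datetime for comparison."""
    return datetime.strptime(name, "%Y_%m_%dT%H_%M_%S.%f")

# The most recent run is the one the "latest" split alias points at.
latest = max(splits, key=parse_split)
print(latest)  # → 2024_03_10T11_06_23.658904
```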
## Latest results
These are the [latest results from run 2024-03-10T11:06:23.658904](https://huggingface.co/datasets/open-llm-leaderboard/details_saltlux__luxia-21.4b-alignment-v1.0/blob/main/results_2024-03-10T11-06-23.658904.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6555984630611819,
"acc_stderr": 0.03202413494937558,
"acc_norm": 0.6552172318444804,
"acc_norm_stderr": 0.03269035140382117,
"mc1": 0.46878824969400246,
"mc1_stderr": 0.017469364874577537,
"mc2": 0.6324196158065736,
"mc2_stderr": 0.015183642172146008
},
"harness|arc:challenge|25": {
"acc": 0.6655290102389079,
"acc_stderr": 0.013787460322441374,
"acc_norm": 0.697098976109215,
"acc_norm_stderr": 0.013428241573185349
},
"harness|hellaswag|10": {
"acc": 0.6896036646086438,
"acc_stderr": 0.004617103280372031,
"acc_norm": 0.8705437163911571,
"acc_norm_stderr": 0.003350181812941611
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6842105263157895,
"acc_stderr": 0.0378272898086547,
"acc_norm": 0.6842105263157895,
"acc_norm_stderr": 0.0378272898086547
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7245283018867924,
"acc_stderr": 0.027495663683724057,
"acc_norm": 0.7245283018867924,
"acc_norm_stderr": 0.027495663683724057
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.76,
"acc_stderr": 0.04292346959909282,
"acc_norm": 0.76,
"acc_norm_stderr": 0.04292346959909282
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.6,
"acc_stderr": 0.03202563076101735,
"acc_norm": 0.6,
"acc_norm_stderr": 0.03202563076101735
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5655172413793104,
"acc_stderr": 0.04130740879555498,
"acc_norm": 0.5655172413793104,
"acc_norm_stderr": 0.04130740879555498
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42063492063492064,
"acc_stderr": 0.025424835086924,
"acc_norm": 0.42063492063492064,
"acc_norm_stderr": 0.025424835086924
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.47619047619047616,
"acc_stderr": 0.04467062628403273,
"acc_norm": 0.47619047619047616,
"acc_norm_stderr": 0.04467062628403273
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.36,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7774193548387097,
"acc_stderr": 0.023664216671642518,
"acc_norm": 0.7774193548387097,
"acc_norm_stderr": 0.023664216671642518
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7757575757575758,
"acc_stderr": 0.03256866661681102,
"acc_norm": 0.7757575757575758,
"acc_norm_stderr": 0.03256866661681102
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7878787878787878,
"acc_stderr": 0.02912652283458682,
"acc_norm": 0.7878787878787878,
"acc_norm_stderr": 0.02912652283458682
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.020986854593289733,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.020986854593289733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.32222222222222224,
"acc_stderr": 0.028493465091028593,
"acc_norm": 0.32222222222222224,
"acc_norm_stderr": 0.028493465091028593
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6848739495798319,
"acc_stderr": 0.030176808288974337,
"acc_norm": 0.6848739495798319,
"acc_norm_stderr": 0.030176808288974337
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3576158940397351,
"acc_stderr": 0.03913453431177258,
"acc_norm": 0.3576158940397351,
"acc_norm_stderr": 0.03913453431177258
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5324074074074074,
"acc_stderr": 0.03402801581358966,
"acc_norm": 0.5324074074074074,
"acc_norm_stderr": 0.03402801581358966
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8529411764705882,
"acc_stderr": 0.024857478080250454,
"acc_norm": 0.8529411764705882,
"acc_norm_stderr": 0.024857478080250454
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.031146796482972465,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.031146796482972465
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.8091603053435115,
"acc_stderr": 0.03446513350752599,
"acc_norm": 0.8091603053435115,
"acc_norm_stderr": 0.03446513350752599
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.039578354719809805,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.039578354719809805
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.42857142857142855,
"acc_stderr": 0.04697113923010212,
"acc_norm": 0.42857142857142855,
"acc_norm_stderr": 0.04697113923010212
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.01326534626132379,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.01326534626132379
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.02344582627654554,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.02344582627654554
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.42569832402234636,
"acc_stderr": 0.016536829648997112,
"acc_norm": 0.42569832402234636,
"acc_norm_stderr": 0.016536829648997112
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.738562091503268,
"acc_stderr": 0.025160998214292452,
"acc_norm": 0.738562091503268,
"acc_norm_stderr": 0.025160998214292452
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.02567025924218893,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.02567025924218893
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.0242885336377261,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.0242885336377261
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46740547588005216,
"acc_stderr": 0.012743072942653342,
"acc_norm": 0.46740547588005216,
"acc_norm_stderr": 0.012743072942653342
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6764705882352942,
"acc_stderr": 0.018926082916083383,
"acc_norm": 0.6764705882352942,
"acc_norm_stderr": 0.018926082916083383
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6818181818181818,
"acc_stderr": 0.044612721759105085,
"acc_norm": 0.6818181818181818,
"acc_norm_stderr": 0.044612721759105085
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.726530612244898,
"acc_stderr": 0.02853556033712844,
"acc_norm": 0.726530612244898,
"acc_norm_stderr": 0.02853556033712844
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8407960199004975,
"acc_stderr": 0.02587064676616913,
"acc_norm": 0.8407960199004975,
"acc_norm_stderr": 0.02587064676616913
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.86,
"acc_stderr": 0.0348735088019777,
"acc_norm": 0.86,
"acc_norm_stderr": 0.0348735088019777
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5421686746987951,
"acc_stderr": 0.0387862677100236,
"acc_norm": 0.5421686746987951,
"acc_norm_stderr": 0.0387862677100236
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8421052631578947,
"acc_stderr": 0.027966785859160893,
"acc_norm": 0.8421052631578947,
"acc_norm_stderr": 0.027966785859160893
},
"harness|truthfulqa:mc|0": {
"mc1": 0.46878824969400246,
"mc1_stderr": 0.017469364874577537,
"mc2": 0.6324196158065736,
"mc2_stderr": 0.015183642172146008
},
"harness|winogrande|5": {
"acc": 0.8161010260457774,
"acc_stderr": 0.010887916013305889
},
"harness|gsm8k|5": {
"acc": 0.730098559514784,
"acc_stderr": 0.012227442856468897
}
}
```
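The `all` entry above aggregates the per-task scores. As an illustration, a mean accuracy can be recomputed from the per-task values; a minimal sketch using three of the MMLU sub-task values shown above (the leaderboard averages over the full task set):

```python
# Recompute a mean accuracy from per-task results, the same averaging idea
# behind the "all" aggregate. Values copied from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.6296296296296297},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6842105263157895},
}

# Unweighted mean over the selected tasks.
mean_acc = sum(task["acc"] for task in results.values()) / len(results)
print(round(mean_acc, 4))  # → 0.5579
```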
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
BambiMC/ts_test_2 | BambiMC | "2024-03-13T19:06:32Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:11:57Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 110880
num_examples: 576
download_size: 2240
dataset_size: 110880
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_frankenmerger__gemoy-4b-instruct-scientific | open-llm-leaderboard-old | "2024-03-10T11:13:08Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T11:12:48Z" | ---
pretty_name: Evaluation run of frankenmerger/gemoy-4b-instruct-scientific
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frankenmerger/gemoy-4b-instruct-scientific](https://huggingface.co/frankenmerger/gemoy-4b-instruct-scientific)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:10:43.531199](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific/blob/main/results_2024-03-10T11-10-43.531199.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.3887480937200939,\n\
\ \"acc_stderr\": 0.033967527013847434,\n \"acc_norm\": 0.3919353670094879,\n\
\ \"acc_norm_stderr\": 0.03472007289325813,\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.4195962328166831,\n\
\ \"mc2_stderr\": 0.014414337460874078\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.39334470989761094,\n \"acc_stderr\": 0.014275101465693026,\n\
\ \"acc_norm\": 0.4197952218430034,\n \"acc_norm_stderr\": 0.014422181226303026\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.46106353316072496,\n\
\ \"acc_stderr\": 0.004974628903829138,\n \"acc_norm\": 0.6304521011750648,\n\
\ \"acc_norm_stderr\": 0.004816958817726085\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.42962962962962964,\n\
\ \"acc_stderr\": 0.04276349494376599,\n \"acc_norm\": 0.42962962962962964,\n\
\ \"acc_norm_stderr\": 0.04276349494376599\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3092105263157895,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.3092105263157895,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.45,\n\
\ \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.3849056603773585,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.3849056603773585,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.4166666666666667,\n\
\ \"acc_stderr\": 0.041227287076512825,\n \"acc_norm\": 0.4166666666666667,\n\
\ \"acc_norm_stderr\": 0.041227287076512825\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \"acc_norm\": 0.37,\n\
\ \"acc_norm_stderr\": 0.04852365870939099\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.04408440022768077,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.04408440022768077\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.3468208092485549,\n\
\ \"acc_stderr\": 0.03629146670159663,\n \"acc_norm\": 0.3468208092485549,\n\
\ \"acc_norm_stderr\": 0.03629146670159663\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.22549019607843138,\n \"acc_stderr\": 0.041583075330832865,\n\
\ \"acc_norm\": 0.22549019607843138,\n \"acc_norm_stderr\": 0.041583075330832865\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.51,\n \"acc_stderr\": 0.05024183937956911,\n \"acc_norm\": 0.51,\n\
\ \"acc_norm_stderr\": 0.05024183937956911\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.32340425531914896,\n \"acc_stderr\": 0.03057944277361034,\n\
\ \"acc_norm\": 0.32340425531914896,\n \"acc_norm_stderr\": 0.03057944277361034\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2543859649122807,\n\
\ \"acc_stderr\": 0.040969851398436716,\n \"acc_norm\": 0.2543859649122807,\n\
\ \"acc_norm_stderr\": 0.040969851398436716\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.2566137566137566,\n \"acc_stderr\": 0.022494510767503154,\n \"\
acc_norm\": 0.2566137566137566,\n \"acc_norm_stderr\": 0.022494510767503154\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2619047619047619,\n\
\ \"acc_stderr\": 0.03932537680392871,\n \"acc_norm\": 0.2619047619047619,\n\
\ \"acc_norm_stderr\": 0.03932537680392871\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.38064516129032255,\n \"acc_stderr\": 0.02762171783290703,\n \"\
acc_norm\": 0.38064516129032255,\n \"acc_norm_stderr\": 0.02762171783290703\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.26108374384236455,\n \"acc_stderr\": 0.030903796952114482,\n \"\
acc_norm\": 0.26108374384236455,\n \"acc_norm_stderr\": 0.030903796952114482\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145633,\n \"acc_norm\"\
: 0.38,\n \"acc_norm_stderr\": 0.04878317312145633\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.4727272727272727,\n \"acc_stderr\": 0.03898531605579419,\n\
\ \"acc_norm\": 0.4727272727272727,\n \"acc_norm_stderr\": 0.03898531605579419\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.43434343434343436,\n \"acc_stderr\": 0.03531505879359183,\n \"\
acc_norm\": 0.43434343434343436,\n \"acc_norm_stderr\": 0.03531505879359183\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.47668393782383417,\n \"acc_stderr\": 0.03604513672442207,\n\
\ \"acc_norm\": 0.47668393782383417,\n \"acc_norm_stderr\": 0.03604513672442207\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.38461538461538464,\n \"acc_stderr\": 0.024666744915187222,\n\
\ \"acc_norm\": 0.38461538461538464,\n \"acc_norm_stderr\": 0.024666744915187222\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.3277310924369748,\n \"acc_stderr\": 0.030489911417673227,\n\
\ \"acc_norm\": 0.3277310924369748,\n \"acc_norm_stderr\": 0.030489911417673227\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.03511807571804725,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.03511807571804725\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.45688073394495415,\n \"acc_stderr\": 0.021357458785226206,\n \"\
acc_norm\": 0.45688073394495415,\n \"acc_norm_stderr\": 0.021357458785226206\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.23148148148148148,\n \"acc_stderr\": 0.028765111718046934,\n \"\
acc_norm\": 0.23148148148148148,\n \"acc_norm_stderr\": 0.028765111718046934\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.4117647058823529,\n \"acc_stderr\": 0.034542365853806094,\n \"\
acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.034542365853806094\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.5358649789029536,\n \"acc_stderr\": 0.03246338898055659,\n \
\ \"acc_norm\": 0.5358649789029536,\n \"acc_norm_stderr\": 0.03246338898055659\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.42152466367713004,\n\
\ \"acc_stderr\": 0.033141902221106564,\n \"acc_norm\": 0.42152466367713004,\n\
\ \"acc_norm_stderr\": 0.033141902221106564\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.45038167938931295,\n \"acc_stderr\": 0.04363643698524779,\n\
\ \"acc_norm\": 0.45038167938931295,\n \"acc_norm_stderr\": 0.04363643698524779\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.5619834710743802,\n \"acc_stderr\": 0.04529146804435792,\n \"\
acc_norm\": 0.5619834710743802,\n \"acc_norm_stderr\": 0.04529146804435792\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.04830366024635331,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.04830366024635331\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.44171779141104295,\n \"acc_stderr\": 0.039015918258361836,\n\
\ \"acc_norm\": 0.44171779141104295,\n \"acc_norm_stderr\": 0.039015918258361836\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.32142857142857145,\n\
\ \"acc_stderr\": 0.04432804055291519,\n \"acc_norm\": 0.32142857142857145,\n\
\ \"acc_norm_stderr\": 0.04432804055291519\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.5145631067961165,\n \"acc_stderr\": 0.049486373240266356,\n\
\ \"acc_norm\": 0.5145631067961165,\n \"acc_norm_stderr\": 0.049486373240266356\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.6367521367521367,\n\
\ \"acc_stderr\": 0.03150712523091264,\n \"acc_norm\": 0.6367521367521367,\n\
\ \"acc_norm_stderr\": 0.03150712523091264\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695235,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695235\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.508301404853129,\n\
\ \"acc_stderr\": 0.017877498991072008,\n \"acc_norm\": 0.508301404853129,\n\
\ \"acc_norm_stderr\": 0.017877498991072008\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.4046242774566474,\n \"acc_stderr\": 0.026424816594009852,\n\
\ \"acc_norm\": 0.4046242774566474,\n \"acc_norm_stderr\": 0.026424816594009852\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2536312849162011,\n\
\ \"acc_stderr\": 0.014551553659369923,\n \"acc_norm\": 0.2536312849162011,\n\
\ \"acc_norm_stderr\": 0.014551553659369923\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.41830065359477125,\n \"acc_stderr\": 0.028245134024387282,\n\
\ \"acc_norm\": 0.41830065359477125,\n \"acc_norm_stderr\": 0.028245134024387282\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.3954983922829582,\n\
\ \"acc_stderr\": 0.027770918531427834,\n \"acc_norm\": 0.3954983922829582,\n\
\ \"acc_norm_stderr\": 0.027770918531427834\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.39814814814814814,\n \"acc_stderr\": 0.027237415094592477,\n\
\ \"acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.027237415094592477\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.31560283687943264,\n \"acc_stderr\": 0.027724989449509314,\n \
\ \"acc_norm\": 0.31560283687943264,\n \"acc_norm_stderr\": 0.027724989449509314\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3005215123859192,\n\
\ \"acc_stderr\": 0.011709918883039122,\n \"acc_norm\": 0.3005215123859192,\n\
\ \"acc_norm_stderr\": 0.011709918883039122\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.22058823529411764,\n \"acc_stderr\": 0.025187786660227272,\n\
\ \"acc_norm\": 0.22058823529411764,\n \"acc_norm_stderr\": 0.025187786660227272\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.3709150326797386,\n \"acc_stderr\": 0.019542101564854118,\n \
\ \"acc_norm\": 0.3709150326797386,\n \"acc_norm_stderr\": 0.019542101564854118\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.44545454545454544,\n\
\ \"acc_stderr\": 0.047605488214603246,\n \"acc_norm\": 0.44545454545454544,\n\
\ \"acc_norm_stderr\": 0.047605488214603246\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.37551020408163266,\n \"acc_stderr\": 0.031001209039894836,\n\
\ \"acc_norm\": 0.37551020408163266,\n \"acc_norm_stderr\": 0.031001209039894836\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.5174129353233831,\n\
\ \"acc_stderr\": 0.03533389234739245,\n \"acc_norm\": 0.5174129353233831,\n\
\ \"acc_norm_stderr\": 0.03533389234739245\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.61,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.61,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4397590361445783,\n\
\ \"acc_stderr\": 0.03864139923699122,\n \"acc_norm\": 0.4397590361445783,\n\
\ \"acc_norm_stderr\": 0.03864139923699122\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.5146198830409356,\n \"acc_stderr\": 0.038331852752130254,\n\
\ \"acc_norm\": 0.5146198830409356,\n \"acc_norm_stderr\": 0.038331852752130254\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.26438188494492043,\n\
\ \"mc1_stderr\": 0.015438211119522514,\n \"mc2\": 0.4195962328166831,\n\
\ \"mc2_stderr\": 0.014414337460874078\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6306235201262825,\n \"acc_stderr\": 0.01356447059605351\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.15466262319939347,\n \
\ \"acc_stderr\": 0.009959786220917213\n }\n}\n```"
repo_url: https://huggingface.co/frankenmerger/gemoy-4b-instruct-scientific
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-10-43.531199.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-10-43.531199.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- '**/details_harness|winogrande|5_2024-03-10T11-10-43.531199.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-10-43.531199.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_10_43.531199
path:
- results_2024-03-10T11-10-43.531199.parquet
- split: latest
path:
- results_2024-03-10T11-10-43.531199.parquet
---
# Dataset Card for Evaluation run of frankenmerger/gemoy-4b-instruct-scientific
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [frankenmerger/gemoy-4b-instruct-scientific](https://huggingface.co/frankenmerger/gemoy-4b-instruct-scientific) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run. Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T11:10:43.531199](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__gemoy-4b-instruct-scientific/blob/main/results_2024-03-10T11-10-43.531199.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task in the results and in the "latest" split for each eval):
```python
{
"all": {
"acc": 0.3887480937200939,
"acc_stderr": 0.033967527013847434,
"acc_norm": 0.3919353670094879,
"acc_norm_stderr": 0.03472007289325813,
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.4195962328166831,
"mc2_stderr": 0.014414337460874078
},
"harness|arc:challenge|25": {
"acc": 0.39334470989761094,
"acc_stderr": 0.014275101465693026,
"acc_norm": 0.4197952218430034,
"acc_norm_stderr": 0.014422181226303026
},
"harness|hellaswag|10": {
"acc": 0.46106353316072496,
"acc_stderr": 0.004974628903829138,
"acc_norm": 0.6304521011750648,
"acc_norm_stderr": 0.004816958817726085
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.42962962962962964,
"acc_stderr": 0.04276349494376599,
"acc_norm": 0.42962962962962964,
"acc_norm_stderr": 0.04276349494376599
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3092105263157895,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.3092105263157895,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.45,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.45,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.3849056603773585,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.3849056603773585,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.4166666666666667,
"acc_stderr": 0.041227287076512825,
"acc_norm": 0.4166666666666667,
"acc_norm_stderr": 0.041227287076512825
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768077,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768077
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.3468208092485549,
"acc_stderr": 0.03629146670159663,
"acc_norm": 0.3468208092485549,
"acc_norm_stderr": 0.03629146670159663
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.22549019607843138,
"acc_stderr": 0.041583075330832865,
"acc_norm": 0.22549019607843138,
"acc_norm_stderr": 0.041583075330832865
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956911,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956911
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.32340425531914896,
"acc_stderr": 0.03057944277361034,
"acc_norm": 0.32340425531914896,
"acc_norm_stderr": 0.03057944277361034
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2543859649122807,
"acc_stderr": 0.040969851398436716,
"acc_norm": 0.2543859649122807,
"acc_norm_stderr": 0.040969851398436716
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.2566137566137566,
"acc_stderr": 0.022494510767503154,
"acc_norm": 0.2566137566137566,
"acc_norm_stderr": 0.022494510767503154
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2619047619047619,
"acc_stderr": 0.03932537680392871,
"acc_norm": 0.2619047619047619,
"acc_norm_stderr": 0.03932537680392871
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.38064516129032255,
"acc_stderr": 0.02762171783290703,
"acc_norm": 0.38064516129032255,
"acc_norm_stderr": 0.02762171783290703
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.26108374384236455,
"acc_stderr": 0.030903796952114482,
"acc_norm": 0.26108374384236455,
"acc_norm_stderr": 0.030903796952114482
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145633,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145633
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.4727272727272727,
"acc_stderr": 0.03898531605579419,
"acc_norm": 0.4727272727272727,
"acc_norm_stderr": 0.03898531605579419
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.43434343434343436,
"acc_stderr": 0.03531505879359183,
"acc_norm": 0.43434343434343436,
"acc_norm_stderr": 0.03531505879359183
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.47668393782383417,
"acc_stderr": 0.03604513672442207,
"acc_norm": 0.47668393782383417,
"acc_norm_stderr": 0.03604513672442207
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.38461538461538464,
"acc_stderr": 0.024666744915187222,
"acc_norm": 0.38461538461538464,
"acc_norm_stderr": 0.024666744915187222
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.3277310924369748,
"acc_stderr": 0.030489911417673227,
"acc_norm": 0.3277310924369748,
"acc_norm_stderr": 0.030489911417673227
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.03511807571804725,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.03511807571804725
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.45688073394495415,
"acc_stderr": 0.021357458785226206,
"acc_norm": 0.45688073394495415,
"acc_norm_stderr": 0.021357458785226206
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.028765111718046934,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.028765111718046934
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.034542365853806094,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.034542365853806094
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.5358649789029536,
"acc_stderr": 0.03246338898055659,
"acc_norm": 0.5358649789029536,
"acc_norm_stderr": 0.03246338898055659
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.42152466367713004,
"acc_stderr": 0.033141902221106564,
"acc_norm": 0.42152466367713004,
"acc_norm_stderr": 0.033141902221106564
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.45038167938931295,
"acc_stderr": 0.04363643698524779,
"acc_norm": 0.45038167938931295,
"acc_norm_stderr": 0.04363643698524779
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.5619834710743802,
"acc_stderr": 0.04529146804435792,
"acc_norm": 0.5619834710743802,
"acc_norm_stderr": 0.04529146804435792
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.04830366024635331,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.04830366024635331
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.44171779141104295,
"acc_stderr": 0.039015918258361836,
"acc_norm": 0.44171779141104295,
"acc_norm_stderr": 0.039015918258361836
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.32142857142857145,
"acc_stderr": 0.04432804055291519,
"acc_norm": 0.32142857142857145,
"acc_norm_stderr": 0.04432804055291519
},
"harness|hendrycksTest-management|5": {
"acc": 0.5145631067961165,
"acc_stderr": 0.049486373240266356,
"acc_norm": 0.5145631067961165,
"acc_norm_stderr": 0.049486373240266356
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.6367521367521367,
"acc_stderr": 0.03150712523091264,
"acc_norm": 0.6367521367521367,
"acc_norm_stderr": 0.03150712523091264
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695235,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695235
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.508301404853129,
"acc_stderr": 0.017877498991072008,
"acc_norm": 0.508301404853129,
"acc_norm_stderr": 0.017877498991072008
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.4046242774566474,
"acc_stderr": 0.026424816594009852,
"acc_norm": 0.4046242774566474,
"acc_norm_stderr": 0.026424816594009852
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2536312849162011,
"acc_stderr": 0.014551553659369923,
"acc_norm": 0.2536312849162011,
"acc_norm_stderr": 0.014551553659369923
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.41830065359477125,
"acc_stderr": 0.028245134024387282,
"acc_norm": 0.41830065359477125,
"acc_norm_stderr": 0.028245134024387282
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.3954983922829582,
"acc_stderr": 0.027770918531427834,
"acc_norm": 0.3954983922829582,
"acc_norm_stderr": 0.027770918531427834
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.027237415094592477,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.027237415094592477
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.31560283687943264,
"acc_stderr": 0.027724989449509314,
"acc_norm": 0.31560283687943264,
"acc_norm_stderr": 0.027724989449509314
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3005215123859192,
"acc_stderr": 0.011709918883039122,
"acc_norm": 0.3005215123859192,
"acc_norm_stderr": 0.011709918883039122
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.22058823529411764,
"acc_stderr": 0.025187786660227272,
"acc_norm": 0.22058823529411764,
"acc_norm_stderr": 0.025187786660227272
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.3709150326797386,
"acc_stderr": 0.019542101564854118,
"acc_norm": 0.3709150326797386,
"acc_norm_stderr": 0.019542101564854118
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.44545454545454544,
"acc_stderr": 0.047605488214603246,
"acc_norm": 0.44545454545454544,
"acc_norm_stderr": 0.047605488214603246
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.37551020408163266,
"acc_stderr": 0.031001209039894836,
"acc_norm": 0.37551020408163266,
"acc_norm_stderr": 0.031001209039894836
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.5174129353233831,
"acc_stderr": 0.03533389234739245,
"acc_norm": 0.5174129353233831,
"acc_norm_stderr": 0.03533389234739245
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4397590361445783,
"acc_stderr": 0.03864139923699122,
"acc_norm": 0.4397590361445783,
"acc_norm_stderr": 0.03864139923699122
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.5146198830409356,
"acc_stderr": 0.038331852752130254,
"acc_norm": 0.5146198830409356,
"acc_norm_stderr": 0.038331852752130254
},
"harness|truthfulqa:mc|0": {
"mc1": 0.26438188494492043,
"mc1_stderr": 0.015438211119522514,
"mc2": 0.4195962328166831,
"mc2_stderr": 0.014414337460874078
},
"harness|winogrande|5": {
"acc": 0.6306235201262825,
"acc_stderr": 0.01356447059605351
},
"harness|gsm8k|5": {
"acc": 0.15466262319939347,
"acc_stderr": 0.009959786220917213
}
}
```
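Once loaded (or read straight from the results JSON above), the per-task metrics are plain nested dictionaries, so headline numbers can be pulled out with ordinary dict access. A minimal sketch using a few of the values shown above (the dictionary below is a hand-copied excerpt, not a live download):

```python
# Excerpt of the aggregated results shown above (hand-copied, not fetched).
results = {
    "all": {"acc": 0.3887480937200939, "acc_norm": 0.3919353670094879},
    "harness|winogrande|5": {"acc": 0.6306235201262825},
    "harness|gsm8k|5": {"acc": 0.15466262319939347},
}

# Headline accuracy across all tasks.
overall_acc = results["all"]["acc"]
print(round(overall_acc, 4))  # 0.3887

# Mean accuracy over the two generative tasks in this excerpt.
gen_tasks = ["harness|winogrande|5", "harness|gsm8k|5"]
gen_mean = sum(results[t]["acc"] for t in gen_tasks) / len(gen_tasks)
print(gen_mean)
```

The same access pattern works on the full results file, whose task keys follow the `harness|<task>|<n_shot>` naming visible in the config names above.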
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
felipesampaio2010/stevestotch | felipesampaio2010 | "2024-03-10T11:13:52Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T11:13:21Z" | ---
license: openrail
---
|
udmurtNLP/udmurt-russian-english-labse | udmurtNLP | "2024-03-26T09:29:37Z" | 0 | 2 | [
"language:udm",
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:21:15Z" | ---
dataset_info:
features:
- name: rus
dtype: string
- name: udm
dtype: string
- name: eng
dtype: string
splits:
- name: train
num_bytes: 18829292
num_examples: 40365
download_size: 9006599
dataset_size: 18829292
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
language:
- udm
size_categories:
- 10K<n<100K
---
# Udmurt-Russian-English parallel corpus for LaBSE training
The English column was machine-translated from Russian using the Yandex.Translator API.
Please note that the last 250 rows of the dataset are taken from FLORES-250, so remove them before training a machine-translation model. |
BambiMC/ts_train | BambiMC | "2024-03-10T11:30:59Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:30:38Z" | ---
license: mit
---
|
taufiqdp/Indo4B-Plus | taufiqdp | "2024-03-13T07:19:16Z" | 0 | 0 | [
"language:id",
"license:mit",
"size_categories:10M<n<100M",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T11:31:12Z" | ---
language:
- id
license: mit
---
|
knlp/hansol-qa_base | knlp | "2024-03-10T11:33:10Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:32:49Z" | ---
license: apache-2.0
dataset_info:
features:
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 3210110
num_examples: 6440
download_size: 577722
dataset_size: 3210110
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_frankenmerger__delta-4B-super | open-llm-leaderboard-old | "2024-03-10T11:35:37Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T11:35:16Z" | ---
pretty_name: Evaluation run of frankenmerger/delta-4B-super
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frankenmerger/delta-4B-super](https://huggingface.co/frankenmerger/delta-4B-super)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frankenmerger__delta-4B-super\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:33:31.513769](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4B-super/blob/main/results_2024-03-10T11-33-31.513769.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5905528687838363,\n\
\ \"acc_stderr\": 0.03352325174942067,\n \"acc_norm\": 0.5934267748828804,\n\
\ \"acc_norm_stderr\": 0.034203989694081526,\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.5173591737183307,\n\
\ \"mc2_stderr\": 0.016143767707448048\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5674061433447098,\n \"acc_stderr\": 0.014478005694182526,\n\
\ \"acc_norm\": 0.5861774744027304,\n \"acc_norm_stderr\": 0.014392730009221004\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5875323640709023,\n\
\ \"acc_stderr\": 0.0049127238489447955,\n \"acc_norm\": 0.7628958374825732,\n\
\ \"acc_norm_stderr\": 0.004244374809273614\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.04605661864718381,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.04605661864718381\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5185185185185185,\n\
\ \"acc_stderr\": 0.043163785995113245,\n \"acc_norm\": 0.5185185185185185,\n\
\ \"acc_norm_stderr\": 0.043163785995113245\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.618421052631579,\n \"acc_stderr\": 0.03953173377749194,\n\
\ \"acc_norm\": 0.618421052631579,\n \"acc_norm_stderr\": 0.03953173377749194\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.66,\n\
\ \"acc_stderr\": 0.04760952285695238,\n \"acc_norm\": 0.66,\n \
\ \"acc_norm_stderr\": 0.04760952285695238\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6264150943396226,\n \"acc_stderr\": 0.02977308271331987,\n\
\ \"acc_norm\": 0.6264150943396226,\n \"acc_norm_stderr\": 0.02977308271331987\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.6458333333333334,\n\
\ \"acc_stderr\": 0.039994111357535424,\n \"acc_norm\": 0.6458333333333334,\n\
\ \"acc_norm_stderr\": 0.039994111357535424\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \"acc_norm\": 0.43,\n\
\ \"acc_norm_stderr\": 0.049756985195624284\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.39215686274509803,\n \"acc_stderr\": 0.048580835742663434,\n\
\ \"acc_norm\": 0.39215686274509803,\n \"acc_norm_stderr\": 0.048580835742663434\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.68,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.68,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5063829787234042,\n \"acc_stderr\": 0.032683358999363366,\n\
\ \"acc_norm\": 0.5063829787234042,\n \"acc_norm_stderr\": 0.032683358999363366\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.35964912280701755,\n\
\ \"acc_stderr\": 0.045144961328736334,\n \"acc_norm\": 0.35964912280701755,\n\
\ \"acc_norm_stderr\": 0.045144961328736334\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5448275862068965,\n \"acc_stderr\": 0.04149886942192117,\n\
\ \"acc_norm\": 0.5448275862068965,\n \"acc_norm_stderr\": 0.04149886942192117\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.4470899470899471,\n \"acc_stderr\": 0.025606723995777025,\n \"\
acc_norm\": 0.4470899470899471,\n \"acc_norm_stderr\": 0.025606723995777025\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.48412698412698413,\n\
\ \"acc_stderr\": 0.04469881854072606,\n \"acc_norm\": 0.48412698412698413,\n\
\ \"acc_norm_stderr\": 0.04469881854072606\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7096774193548387,\n\
\ \"acc_stderr\": 0.025822106119415895,\n \"acc_norm\": 0.7096774193548387,\n\
\ \"acc_norm_stderr\": 0.025822106119415895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4630541871921182,\n \"acc_stderr\": 0.035083705204426656,\n\
\ \"acc_norm\": 0.4630541871921182,\n \"acc_norm_stderr\": 0.035083705204426656\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.64,\n \"acc_stderr\": 0.048241815132442176,\n \"acc_norm\"\
: 0.64,\n \"acc_norm_stderr\": 0.048241815132442176\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7727272727272727,\n \"acc_stderr\": 0.02985751567338642,\n \"\
acc_norm\": 0.7727272727272727,\n \"acc_norm_stderr\": 0.02985751567338642\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6307692307692307,\n \"acc_stderr\": 0.024468615241478926,\n\
\ \"acc_norm\": 0.6307692307692307,\n \"acc_norm_stderr\": 0.024468615241478926\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.0287420409039485,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.0287420409039485\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6260504201680672,\n \"acc_stderr\": 0.03142946637883708,\n \
\ \"acc_norm\": 0.6260504201680672,\n \"acc_norm_stderr\": 0.03142946637883708\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3509933774834437,\n \"acc_stderr\": 0.03896981964257375,\n \"\
acc_norm\": 0.3509933774834437,\n \"acc_norm_stderr\": 0.03896981964257375\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7963302752293578,\n \"acc_stderr\": 0.01726674208763082,\n \"\
acc_norm\": 0.7963302752293578,\n \"acc_norm_stderr\": 0.01726674208763082\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.49074074074074076,\n \"acc_stderr\": 0.03409386946992699,\n \"\
acc_norm\": 0.49074074074074076,\n \"acc_norm_stderr\": 0.03409386946992699\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7450980392156863,\n \"acc_stderr\": 0.03058759135160425,\n \"\
acc_norm\": 0.7450980392156863,\n \"acc_norm_stderr\": 0.03058759135160425\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7426160337552743,\n \"acc_stderr\": 0.02845882099146029,\n \
\ \"acc_norm\": 0.7426160337552743,\n \"acc_norm_stderr\": 0.02845882099146029\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7022900763358778,\n \"acc_stderr\": 0.040103589424622034,\n\
\ \"acc_norm\": 0.7022900763358778,\n \"acc_norm_stderr\": 0.040103589424622034\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070417,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070417\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7870370370370371,\n\
\ \"acc_stderr\": 0.0395783547198098,\n \"acc_norm\": 0.7870370370370371,\n\
\ \"acc_norm_stderr\": 0.0395783547198098\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7423312883435583,\n \"acc_stderr\": 0.03436150827846917,\n\
\ \"acc_norm\": 0.7423312883435583,\n \"acc_norm_stderr\": 0.03436150827846917\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4375,\n\
\ \"acc_stderr\": 0.04708567521880525,\n \"acc_norm\": 0.4375,\n \
\ \"acc_norm_stderr\": 0.04708567521880525\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7475728155339806,\n \"acc_stderr\": 0.04301250399690878,\n\
\ \"acc_norm\": 0.7475728155339806,\n \"acc_norm_stderr\": 0.04301250399690878\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.7948717948717948,\n\
\ \"acc_stderr\": 0.026453508054040332,\n \"acc_norm\": 0.7948717948717948,\n\
\ \"acc_norm_stderr\": 0.026453508054040332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.63,\n \"acc_stderr\": 0.048523658709391,\n \
\ \"acc_norm\": 0.63,\n \"acc_norm_stderr\": 0.048523658709391\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6909323116219668,\n\
\ \"acc_stderr\": 0.016524988919702208,\n \"acc_norm\": 0.6909323116219668,\n\
\ \"acc_norm_stderr\": 0.016524988919702208\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.024946792225272314,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.024946792225272314\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22905027932960895,\n\
\ \"acc_stderr\": 0.014054314935614562,\n \"acc_norm\": 0.22905027932960895,\n\
\ \"acc_norm_stderr\": 0.014054314935614562\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6241830065359477,\n \"acc_stderr\": 0.02773283435336394,\n\
\ \"acc_norm\": 0.6241830065359477,\n \"acc_norm_stderr\": 0.02773283435336394\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.02685882587948854,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.02685882587948854\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6358024691358025,\n \"acc_stderr\": 0.026774929899722324,\n\
\ \"acc_norm\": 0.6358024691358025,\n \"acc_norm_stderr\": 0.026774929899722324\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.450354609929078,\n \"acc_stderr\": 0.029680105565029036,\n \
\ \"acc_norm\": 0.450354609929078,\n \"acc_norm_stderr\": 0.029680105565029036\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.42698826597131684,\n\
\ \"acc_stderr\": 0.012633353557534427,\n \"acc_norm\": 0.42698826597131684,\n\
\ \"acc_norm_stderr\": 0.012633353557534427\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5257352941176471,\n \"acc_stderr\": 0.030332578094555026,\n\
\ \"acc_norm\": 0.5257352941176471,\n \"acc_norm_stderr\": 0.030332578094555026\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5866013071895425,\n \"acc_stderr\": 0.019922115682786685,\n \
\ \"acc_norm\": 0.5866013071895425,\n \"acc_norm_stderr\": 0.019922115682786685\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6909090909090909,\n\
\ \"acc_stderr\": 0.044262946482000985,\n \"acc_norm\": 0.6909090909090909,\n\
\ \"acc_norm_stderr\": 0.044262946482000985\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.673469387755102,\n \"acc_stderr\": 0.03002105623844031,\n\
\ \"acc_norm\": 0.673469387755102,\n \"acc_norm_stderr\": 0.03002105623844031\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7761194029850746,\n\
\ \"acc_stderr\": 0.029475250236017183,\n \"acc_norm\": 0.7761194029850746,\n\
\ \"acc_norm_stderr\": 0.029475250236017183\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.75,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4759036144578313,\n\
\ \"acc_stderr\": 0.038879718495972646,\n \"acc_norm\": 0.4759036144578313,\n\
\ \"acc_norm_stderr\": 0.038879718495972646\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6783625730994152,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.6783625730994152,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3684210526315789,\n\
\ \"mc1_stderr\": 0.016886551261046042,\n \"mc2\": 0.5173591737183307,\n\
\ \"mc2_stderr\": 0.016143767707448048\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7363851617995264,\n \"acc_stderr\": 0.012382849299658468\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.46929492039423804,\n \
\ \"acc_stderr\": 0.013746490739560042\n }\n}\n```"
repo_url: https://huggingface.co/frankenmerger/delta-4B-super
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-33-31.513769.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-33-31.513769.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- '**/details_harness|winogrande|5_2024-03-10T11-33-31.513769.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-33-31.513769.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_33_31.513769
path:
- results_2024-03-10T11-33-31.513769.parquet
- split: latest
path:
- results_2024-03-10T11-33-31.513769.parquet
---
# Dataset Card for Evaluation run of frankenmerger/delta-4B-super
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [frankenmerger/delta-4B-super](https://huggingface.co/frankenmerger/delta-4B-super) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frankenmerger__delta-4B-super",
"harness_winogrande_5",
split="train")
```
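To target a specific run instead of the `latest` split, pass the timestamp-based split name. A minimal sketch of the naming convention, assuming splits are derived from the run timestamp by replacing `-` and `:` with `_` (as seen in this repo's configs):

```python
def timestamp_to_split(ts: str) -> str:
    # "2024-03-10T11:33:31.513769" -> "2024_03_10T11_33_31.513769"
    return ts.replace("-", "_").replace(":", "_")

print(timestamp_to_split("2024-03-10T11:33:31.513769"))
```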
## Latest results
These are the [latest results from run 2024-03-10T11:33:31.513769](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4B-super/blob/main/results_2024-03-10T11-33-31.513769.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5905528687838363,
"acc_stderr": 0.03352325174942067,
"acc_norm": 0.5934267748828804,
"acc_norm_stderr": 0.034203989694081526,
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.5173591737183307,
"mc2_stderr": 0.016143767707448048
},
"harness|arc:challenge|25": {
"acc": 0.5674061433447098,
"acc_stderr": 0.014478005694182526,
"acc_norm": 0.5861774744027304,
"acc_norm_stderr": 0.014392730009221004
},
"harness|hellaswag|10": {
"acc": 0.5875323640709023,
"acc_stderr": 0.0049127238489447955,
"acc_norm": 0.7628958374825732,
"acc_norm_stderr": 0.004244374809273614
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.3,
"acc_stderr": 0.04605661864718381,
"acc_norm": 0.3,
"acc_norm_stderr": 0.04605661864718381
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5185185185185185,
"acc_stderr": 0.043163785995113245,
"acc_norm": 0.5185185185185185,
"acc_norm_stderr": 0.043163785995113245
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.618421052631579,
"acc_stderr": 0.03953173377749194,
"acc_norm": 0.618421052631579,
"acc_norm_stderr": 0.03953173377749194
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695238,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695238
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6264150943396226,
"acc_stderr": 0.02977308271331987,
"acc_norm": 0.6264150943396226,
"acc_norm_stderr": 0.02977308271331987
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.6458333333333334,
"acc_stderr": 0.039994111357535424,
"acc_norm": 0.6458333333333334,
"acc_norm_stderr": 0.039994111357535424
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.39215686274509803,
"acc_stderr": 0.048580835742663434,
"acc_norm": 0.39215686274509803,
"acc_norm_stderr": 0.048580835742663434
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.68,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.68,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5063829787234042,
"acc_stderr": 0.032683358999363366,
"acc_norm": 0.5063829787234042,
"acc_norm_stderr": 0.032683358999363366
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.35964912280701755,
"acc_stderr": 0.045144961328736334,
"acc_norm": 0.35964912280701755,
"acc_norm_stderr": 0.045144961328736334
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5448275862068965,
"acc_stderr": 0.04149886942192117,
"acc_norm": 0.5448275862068965,
"acc_norm_stderr": 0.04149886942192117
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.4470899470899471,
"acc_stderr": 0.025606723995777025,
"acc_norm": 0.4470899470899471,
"acc_norm_stderr": 0.025606723995777025
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.48412698412698413,
"acc_stderr": 0.04469881854072606,
"acc_norm": 0.48412698412698413,
"acc_norm_stderr": 0.04469881854072606
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7096774193548387,
"acc_stderr": 0.025822106119415895,
"acc_norm": 0.7096774193548387,
"acc_norm_stderr": 0.025822106119415895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4630541871921182,
"acc_stderr": 0.035083705204426656,
"acc_norm": 0.4630541871921182,
"acc_norm_stderr": 0.035083705204426656
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.64,
"acc_stderr": 0.048241815132442176,
"acc_norm": 0.64,
"acc_norm_stderr": 0.048241815132442176
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7727272727272727,
"acc_stderr": 0.02985751567338642,
"acc_norm": 0.7727272727272727,
"acc_norm_stderr": 0.02985751567338642
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6307692307692307,
"acc_stderr": 0.024468615241478926,
"acc_norm": 0.6307692307692307,
"acc_norm_stderr": 0.024468615241478926
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.0287420409039485,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.0287420409039485
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6260504201680672,
"acc_stderr": 0.03142946637883708,
"acc_norm": 0.6260504201680672,
"acc_norm_stderr": 0.03142946637883708
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3509933774834437,
"acc_stderr": 0.03896981964257375,
"acc_norm": 0.3509933774834437,
"acc_norm_stderr": 0.03896981964257375
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7963302752293578,
"acc_stderr": 0.01726674208763082,
"acc_norm": 0.7963302752293578,
"acc_norm_stderr": 0.01726674208763082
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.49074074074074076,
"acc_stderr": 0.03409386946992699,
"acc_norm": 0.49074074074074076,
"acc_norm_stderr": 0.03409386946992699
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7450980392156863,
"acc_stderr": 0.03058759135160425,
"acc_norm": 0.7450980392156863,
"acc_norm_stderr": 0.03058759135160425
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.02845882099146029,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.02845882099146029
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7022900763358778,
"acc_stderr": 0.040103589424622034,
"acc_norm": 0.7022900763358778,
"acc_norm_stderr": 0.040103589424622034
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070417,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070417
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7870370370370371,
"acc_stderr": 0.0395783547198098,
"acc_norm": 0.7870370370370371,
"acc_norm_stderr": 0.0395783547198098
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7423312883435583,
"acc_stderr": 0.03436150827846917,
"acc_norm": 0.7423312883435583,
"acc_norm_stderr": 0.03436150827846917
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4375,
"acc_stderr": 0.04708567521880525,
"acc_norm": 0.4375,
"acc_norm_stderr": 0.04708567521880525
},
"harness|hendrycksTest-management|5": {
"acc": 0.7475728155339806,
"acc_stderr": 0.04301250399690878,
"acc_norm": 0.7475728155339806,
"acc_norm_stderr": 0.04301250399690878
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.7948717948717948,
"acc_stderr": 0.026453508054040332,
"acc_norm": 0.7948717948717948,
"acc_norm_stderr": 0.026453508054040332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6909323116219668,
"acc_stderr": 0.016524988919702208,
"acc_norm": 0.6909323116219668,
"acc_norm_stderr": 0.016524988919702208
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.024946792225272314,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.024946792225272314
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22905027932960895,
"acc_stderr": 0.014054314935614562,
"acc_norm": 0.22905027932960895,
"acc_norm_stderr": 0.014054314935614562
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6241830065359477,
"acc_stderr": 0.02773283435336394,
"acc_norm": 0.6241830065359477,
"acc_norm_stderr": 0.02773283435336394
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.02685882587948854,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.02685882587948854
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6358024691358025,
"acc_stderr": 0.026774929899722324,
"acc_norm": 0.6358024691358025,
"acc_norm_stderr": 0.026774929899722324
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.450354609929078,
"acc_stderr": 0.029680105565029036,
"acc_norm": 0.450354609929078,
"acc_norm_stderr": 0.029680105565029036
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.42698826597131684,
"acc_stderr": 0.012633353557534427,
"acc_norm": 0.42698826597131684,
"acc_norm_stderr": 0.012633353557534427
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5257352941176471,
"acc_stderr": 0.030332578094555026,
"acc_norm": 0.5257352941176471,
"acc_norm_stderr": 0.030332578094555026
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5866013071895425,
"acc_stderr": 0.019922115682786685,
"acc_norm": 0.5866013071895425,
"acc_norm_stderr": 0.019922115682786685
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6909090909090909,
"acc_stderr": 0.044262946482000985,
"acc_norm": 0.6909090909090909,
"acc_norm_stderr": 0.044262946482000985
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.673469387755102,
"acc_stderr": 0.03002105623844031,
"acc_norm": 0.673469387755102,
"acc_norm_stderr": 0.03002105623844031
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7761194029850746,
"acc_stderr": 0.029475250236017183,
"acc_norm": 0.7761194029850746,
"acc_norm_stderr": 0.029475250236017183
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4759036144578313,
"acc_stderr": 0.038879718495972646,
"acc_norm": 0.4759036144578313,
"acc_norm_stderr": 0.038879718495972646
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6783625730994152,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.6783625730994152,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3684210526315789,
"mc1_stderr": 0.016886551261046042,
"mc2": 0.5173591737183307,
"mc2_stderr": 0.016143767707448048
},
"harness|winogrande|5": {
"acc": 0.7363851617995264,
"acc_stderr": 0.012382849299658468
},
"harness|gsm8k|5": {
"acc": 0.46929492039423804,
"acc_stderr": 0.013746490739560042
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
DeliberatorArchiver/hls_streaming_media | DeliberatorArchiver | "2024-04-05T08:41:14Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T11:36:08Z" | ---
viewer: false
--- |
open-llm-leaderboard-old/details_frankenmerger__cosmo-3b-test | open-llm-leaderboard-old | "2024-03-10T11:37:20Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T11:36:59Z" | ---
pretty_name: Evaluation run of frankenmerger/cosmo-3b-test
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frankenmerger/cosmo-3b-test](https://huggingface.co/frankenmerger/cosmo-3b-test)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frankenmerger__cosmo-3b-test\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:35:15.251120](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__cosmo-3b-test/blob/main/results_2024-03-10T11-35-15.251120.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.2759591596623969,\n\
\ \"acc_stderr\": 0.03167318249653025,\n \"acc_norm\": 0.2781335496872269,\n\
\ \"acc_norm_stderr\": 0.03246050452019378,\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474207,\n \"mc2\": 0.39020128384630975,\n\
\ \"mc2_stderr\": 0.014969681174597876\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.3293515358361775,\n \"acc_stderr\": 0.013734057652635474,\n\
\ \"acc_norm\": 0.3532423208191126,\n \"acc_norm_stderr\": 0.013967822714840055\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4144592710615415,\n\
\ \"acc_stderr\": 0.004916216503770342,\n \"acc_norm\": 0.5236008763194583,\n\
\ \"acc_norm_stderr\": 0.004984219681732656\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.24444444444444444,\n\
\ \"acc_stderr\": 0.03712537833614867,\n \"acc_norm\": 0.24444444444444444,\n\
\ \"acc_norm_stderr\": 0.03712537833614867\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.3157894736842105,\n \"acc_stderr\": 0.037827289808654685,\n\
\ \"acc_norm\": 0.3157894736842105,\n \"acc_norm_stderr\": 0.037827289808654685\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.19,\n\
\ \"acc_stderr\": 0.03942772444036623,\n \"acc_norm\": 0.19,\n \
\ \"acc_norm_stderr\": 0.03942772444036623\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2339622641509434,\n \"acc_stderr\": 0.026055296901152915,\n\
\ \"acc_norm\": 0.2339622641509434,\n \"acc_norm_stderr\": 0.026055296901152915\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.2569444444444444,\n\
\ \"acc_stderr\": 0.03653946969442099,\n \"acc_norm\": 0.2569444444444444,\n\
\ \"acc_norm_stderr\": 0.03653946969442099\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.22,\n \"acc_stderr\": 0.0416333199893227,\n \"acc_norm\": 0.22,\n\
\ \"acc_norm_stderr\": 0.0416333199893227\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.30057803468208094,\n\
\ \"acc_stderr\": 0.0349610148119118,\n \"acc_norm\": 0.30057803468208094,\n\
\ \"acc_norm_stderr\": 0.0349610148119118\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3137254901960784,\n \"acc_stderr\": 0.04617034827006718,\n\
\ \"acc_norm\": 0.3137254901960784,\n \"acc_norm_stderr\": 0.04617034827006718\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.04688261722621503,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.04688261722621503\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2297872340425532,\n \"acc_stderr\": 0.027501752944412417,\n\
\ \"acc_norm\": 0.2297872340425532,\n \"acc_norm_stderr\": 0.027501752944412417\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2719298245614035,\n\
\ \"acc_stderr\": 0.04185774424022056,\n \"acc_norm\": 0.2719298245614035,\n\
\ \"acc_norm_stderr\": 0.04185774424022056\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2896551724137931,\n \"acc_stderr\": 0.03780019230438014,\n\
\ \"acc_norm\": 0.2896551724137931,\n \"acc_norm_stderr\": 0.03780019230438014\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.31216931216931215,\n \"acc_stderr\": 0.023865206836972602,\n \"\
acc_norm\": 0.31216931216931215,\n \"acc_norm_stderr\": 0.023865206836972602\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.19047619047619047,\n\
\ \"acc_stderr\": 0.03512207412302051,\n \"acc_norm\": 0.19047619047619047,\n\
\ \"acc_norm_stderr\": 0.03512207412302051\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2806451612903226,\n\
\ \"acc_stderr\": 0.025560604721022895,\n \"acc_norm\": 0.2806451612903226,\n\
\ \"acc_norm_stderr\": 0.025560604721022895\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.30049261083743845,\n \"acc_stderr\": 0.03225799476233485,\n\
\ \"acc_norm\": 0.30049261083743845,\n \"acc_norm_stderr\": 0.03225799476233485\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\"\
: 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.2727272727272727,\n \"acc_stderr\": 0.03477691162163659,\n\
\ \"acc_norm\": 0.2727272727272727,\n \"acc_norm_stderr\": 0.03477691162163659\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.3181818181818182,\n \"acc_stderr\": 0.033184773338453315,\n \"\
acc_norm\": 0.3181818181818182,\n \"acc_norm_stderr\": 0.033184773338453315\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.20207253886010362,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.20207253886010362,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.31794871794871793,\n \"acc_stderr\": 0.02361088430892786,\n\
\ \"acc_norm\": 0.31794871794871793,\n \"acc_norm_stderr\": 0.02361088430892786\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.28888888888888886,\n \"acc_stderr\": 0.027634907264178544,\n \
\ \"acc_norm\": 0.28888888888888886,\n \"acc_norm_stderr\": 0.027634907264178544\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.33613445378151263,\n \"acc_stderr\": 0.03068473711513536,\n\
\ \"acc_norm\": 0.33613445378151263,\n \"acc_norm_stderr\": 0.03068473711513536\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.26490066225165565,\n \"acc_stderr\": 0.03603038545360384,\n \"\
acc_norm\": 0.26490066225165565,\n \"acc_norm_stderr\": 0.03603038545360384\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.27522935779816515,\n \"acc_stderr\": 0.0191490937431552,\n \"\
acc_norm\": 0.27522935779816515,\n \"acc_norm_stderr\": 0.0191490937431552\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.2916666666666667,\n \"acc_stderr\": 0.030998666304560534,\n \"\
acc_norm\": 0.2916666666666667,\n \"acc_norm_stderr\": 0.030998666304560534\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.2549019607843137,\n \"acc_stderr\": 0.030587591351604236,\n \"\
acc_norm\": 0.2549019607843137,\n \"acc_norm_stderr\": 0.030587591351604236\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.2742616033755274,\n \"acc_stderr\": 0.029041333510598028,\n \
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.029041333510598028\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.17040358744394618,\n\
\ \"acc_stderr\": 0.02523459344713616,\n \"acc_norm\": 0.17040358744394618,\n\
\ \"acc_norm_stderr\": 0.02523459344713616\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.3053435114503817,\n \"acc_stderr\": 0.04039314978724561,\n\
\ \"acc_norm\": 0.3053435114503817,\n \"acc_norm_stderr\": 0.04039314978724561\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.32231404958677684,\n \"acc_stderr\": 0.04266416363352168,\n \"\
acc_norm\": 0.32231404958677684,\n \"acc_norm_stderr\": 0.04266416363352168\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.24074074074074073,\n\
\ \"acc_stderr\": 0.04133119440243839,\n \"acc_norm\": 0.24074074074074073,\n\
\ \"acc_norm_stderr\": 0.04133119440243839\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.2085889570552147,\n \"acc_stderr\": 0.03192193448934725,\n\
\ \"acc_norm\": 0.2085889570552147,\n \"acc_norm_stderr\": 0.03192193448934725\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.22321428571428573,\n\
\ \"acc_stderr\": 0.039523019677025116,\n \"acc_norm\": 0.22321428571428573,\n\
\ \"acc_norm_stderr\": 0.039523019677025116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.3300970873786408,\n \"acc_stderr\": 0.04656147110012352,\n\
\ \"acc_norm\": 0.3300970873786408,\n \"acc_norm_stderr\": 0.04656147110012352\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.24786324786324787,\n\
\ \"acc_stderr\": 0.028286324075564393,\n \"acc_norm\": 0.24786324786324787,\n\
\ \"acc_norm_stderr\": 0.028286324075564393\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.04688261722621504,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.04688261722621504\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\
\ \"acc_stderr\": 0.015246803197398698,\n \"acc_norm\": 0.2388250319284802,\n\
\ \"acc_norm_stderr\": 0.015246803197398698\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.2832369942196532,\n \"acc_stderr\": 0.02425790170532337,\n\
\ \"acc_norm\": 0.2832369942196532,\n \"acc_norm_stderr\": 0.02425790170532337\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.2446927374301676,\n\
\ \"acc_stderr\": 0.014378169884098405,\n \"acc_norm\": 0.2446927374301676,\n\
\ \"acc_norm_stderr\": 0.014378169884098405\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24183006535947713,\n \"acc_stderr\": 0.024518195641879334,\n\
\ \"acc_norm\": 0.24183006535947713,\n \"acc_norm_stderr\": 0.024518195641879334\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2540192926045016,\n\
\ \"acc_stderr\": 0.024723861504771686,\n \"acc_norm\": 0.2540192926045016,\n\
\ \"acc_norm_stderr\": 0.024723861504771686\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.28703703703703703,\n \"acc_stderr\": 0.025171041915309684,\n\
\ \"acc_norm\": 0.28703703703703703,\n \"acc_norm_stderr\": 0.025171041915309684\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.24113475177304963,\n \"acc_stderr\": 0.025518731049537773,\n \
\ \"acc_norm\": 0.24113475177304963,\n \"acc_norm_stderr\": 0.025518731049537773\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.2666232073011734,\n\
\ \"acc_stderr\": 0.011293836031612145,\n \"acc_norm\": 0.2666232073011734,\n\
\ \"acc_norm_stderr\": 0.011293836031612145\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.21691176470588236,\n \"acc_stderr\": 0.02503584522771125,\n\
\ \"acc_norm\": 0.21691176470588236,\n \"acc_norm_stderr\": 0.02503584522771125\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2630718954248366,\n \"acc_stderr\": 0.017812676542320657,\n \
\ \"acc_norm\": 0.2630718954248366,\n \"acc_norm_stderr\": 0.017812676542320657\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.3181818181818182,\n\
\ \"acc_stderr\": 0.04461272175910509,\n \"acc_norm\": 0.3181818181818182,\n\
\ \"acc_norm_stderr\": 0.04461272175910509\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.2816326530612245,\n \"acc_stderr\": 0.02879518557429129,\n\
\ \"acc_norm\": 0.2816326530612245,\n \"acc_norm_stderr\": 0.02879518557429129\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.2885572139303483,\n\
\ \"acc_stderr\": 0.03203841040213321,\n \"acc_norm\": 0.2885572139303483,\n\
\ \"acc_norm_stderr\": 0.03203841040213321\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.28313253012048195,\n\
\ \"acc_stderr\": 0.03507295431370519,\n \"acc_norm\": 0.28313253012048195,\n\
\ \"acc_norm_stderr\": 0.03507295431370519\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.22807017543859648,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.22807017543859648,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22399020807833536,\n\
\ \"mc1_stderr\": 0.014594964329474207,\n \"mc2\": 0.39020128384630975,\n\
\ \"mc2_stderr\": 0.014969681174597876\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.5430149960536701,\n \"acc_stderr\": 0.01400038676159829\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.013646702047005308,\n \
\ \"acc_stderr\": 0.0031957470754807923\n }\n}\n```"
repo_url: https://huggingface.co/frankenmerger/cosmo-3b-test
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-35-15.251120.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-35-15.251120.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- '**/details_harness|winogrande|5_2024-03-10T11-35-15.251120.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-35-15.251120.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_35_15.251120
path:
- results_2024-03-10T11-35-15.251120.parquet
- split: latest
path:
- results_2024-03-10T11-35-15.251120.parquet
---
# Dataset Card for Evaluation run of frankenmerger/cosmo-3b-test
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [frankenmerger/cosmo-3b-test](https://huggingface.co/frankenmerger/cosmo-3b-test) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frankenmerger__cosmo-3b-test",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T11:35:15.251120](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__cosmo-3b-test/blob/main/results_2024-03-10T11-35-15.251120.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.2759591596623969,
"acc_stderr": 0.03167318249653025,
"acc_norm": 0.2781335496872269,
"acc_norm_stderr": 0.03246050452019378,
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474207,
"mc2": 0.39020128384630975,
"mc2_stderr": 0.014969681174597876
},
"harness|arc:challenge|25": {
"acc": 0.3293515358361775,
"acc_stderr": 0.013734057652635474,
"acc_norm": 0.3532423208191126,
"acc_norm_stderr": 0.013967822714840055
},
"harness|hellaswag|10": {
"acc": 0.4144592710615415,
"acc_stderr": 0.004916216503770342,
"acc_norm": 0.5236008763194583,
"acc_norm_stderr": 0.004984219681732656
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.24444444444444444,
"acc_stderr": 0.03712537833614867,
"acc_norm": 0.24444444444444444,
"acc_norm_stderr": 0.03712537833614867
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.3157894736842105,
"acc_stderr": 0.037827289808654685,
"acc_norm": 0.3157894736842105,
"acc_norm_stderr": 0.037827289808654685
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.19,
"acc_stderr": 0.03942772444036623,
"acc_norm": 0.19,
"acc_norm_stderr": 0.03942772444036623
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2339622641509434,
"acc_stderr": 0.026055296901152915,
"acc_norm": 0.2339622641509434,
"acc_norm_stderr": 0.026055296901152915
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.2569444444444444,
"acc_stderr": 0.03653946969442099,
"acc_norm": 0.2569444444444444,
"acc_norm_stderr": 0.03653946969442099
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.22,
"acc_stderr": 0.0416333199893227,
"acc_norm": 0.22,
"acc_norm_stderr": 0.0416333199893227
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.30057803468208094,
"acc_stderr": 0.0349610148119118,
"acc_norm": 0.30057803468208094,
"acc_norm_stderr": 0.0349610148119118
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3137254901960784,
"acc_stderr": 0.04617034827006718,
"acc_norm": 0.3137254901960784,
"acc_norm_stderr": 0.04617034827006718
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621503,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621503
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2297872340425532,
"acc_stderr": 0.027501752944412417,
"acc_norm": 0.2297872340425532,
"acc_norm_stderr": 0.027501752944412417
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2719298245614035,
"acc_stderr": 0.04185774424022056,
"acc_norm": 0.2719298245614035,
"acc_norm_stderr": 0.04185774424022056
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2896551724137931,
"acc_stderr": 0.03780019230438014,
"acc_norm": 0.2896551724137931,
"acc_norm_stderr": 0.03780019230438014
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.31216931216931215,
"acc_stderr": 0.023865206836972602,
"acc_norm": 0.31216931216931215,
"acc_norm_stderr": 0.023865206836972602
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.19047619047619047,
"acc_stderr": 0.03512207412302051,
"acc_norm": 0.19047619047619047,
"acc_norm_stderr": 0.03512207412302051
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2806451612903226,
"acc_stderr": 0.025560604721022895,
"acc_norm": 0.2806451612903226,
"acc_norm_stderr": 0.025560604721022895
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.30049261083743845,
"acc_stderr": 0.03225799476233485,
"acc_norm": 0.30049261083743845,
"acc_norm_stderr": 0.03225799476233485
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.2727272727272727,
"acc_stderr": 0.03477691162163659,
"acc_norm": 0.2727272727272727,
"acc_norm_stderr": 0.03477691162163659
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.033184773338453315,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.033184773338453315
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.20207253886010362,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.20207253886010362,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.31794871794871793,
"acc_stderr": 0.02361088430892786,
"acc_norm": 0.31794871794871793,
"acc_norm_stderr": 0.02361088430892786
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.28888888888888886,
"acc_stderr": 0.027634907264178544,
"acc_norm": 0.28888888888888886,
"acc_norm_stderr": 0.027634907264178544
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.33613445378151263,
"acc_stderr": 0.03068473711513536,
"acc_norm": 0.33613445378151263,
"acc_norm_stderr": 0.03068473711513536
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.26490066225165565,
"acc_stderr": 0.03603038545360384,
"acc_norm": 0.26490066225165565,
"acc_norm_stderr": 0.03603038545360384
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.27522935779816515,
"acc_stderr": 0.0191490937431552,
"acc_norm": 0.27522935779816515,
"acc_norm_stderr": 0.0191490937431552
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.2916666666666667,
"acc_stderr": 0.030998666304560534,
"acc_norm": 0.2916666666666667,
"acc_norm_stderr": 0.030998666304560534
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.2549019607843137,
"acc_stderr": 0.030587591351604236,
"acc_norm": 0.2549019607843137,
"acc_norm_stderr": 0.030587591351604236
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.029041333510598028,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.029041333510598028
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.17040358744394618,
"acc_stderr": 0.02523459344713616,
"acc_norm": 0.17040358744394618,
"acc_norm_stderr": 0.02523459344713616
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.3053435114503817,
"acc_stderr": 0.04039314978724561,
"acc_norm": 0.3053435114503817,
"acc_norm_stderr": 0.04039314978724561
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.32231404958677684,
"acc_stderr": 0.04266416363352168,
"acc_norm": 0.32231404958677684,
"acc_norm_stderr": 0.04266416363352168
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.24074074074074073,
"acc_stderr": 0.04133119440243839,
"acc_norm": 0.24074074074074073,
"acc_norm_stderr": 0.04133119440243839
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.2085889570552147,
"acc_stderr": 0.03192193448934725,
"acc_norm": 0.2085889570552147,
"acc_norm_stderr": 0.03192193448934725
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.22321428571428573,
"acc_stderr": 0.039523019677025116,
"acc_norm": 0.22321428571428573,
"acc_norm_stderr": 0.039523019677025116
},
"harness|hendrycksTest-management|5": {
"acc": 0.3300970873786408,
"acc_stderr": 0.04656147110012352,
"acc_norm": 0.3300970873786408,
"acc_norm_stderr": 0.04656147110012352
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.24786324786324787,
"acc_stderr": 0.028286324075564393,
"acc_norm": 0.24786324786324787,
"acc_norm_stderr": 0.028286324075564393
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.32,
"acc_stderr": 0.04688261722621504,
"acc_norm": 0.32,
"acc_norm_stderr": 0.04688261722621504
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398698,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398698
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.2832369942196532,
"acc_stderr": 0.02425790170532337,
"acc_norm": 0.2832369942196532,
"acc_norm_stderr": 0.02425790170532337
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.2446927374301676,
"acc_stderr": 0.014378169884098405,
"acc_norm": 0.2446927374301676,
"acc_norm_stderr": 0.014378169884098405
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24183006535947713,
"acc_stderr": 0.024518195641879334,
"acc_norm": 0.24183006535947713,
"acc_norm_stderr": 0.024518195641879334
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2540192926045016,
"acc_stderr": 0.024723861504771686,
"acc_norm": 0.2540192926045016,
"acc_norm_stderr": 0.024723861504771686
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.28703703703703703,
"acc_stderr": 0.025171041915309684,
"acc_norm": 0.28703703703703703,
"acc_norm_stderr": 0.025171041915309684
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.24113475177304963,
"acc_stderr": 0.025518731049537773,
"acc_norm": 0.24113475177304963,
"acc_norm_stderr": 0.025518731049537773
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.2666232073011734,
"acc_stderr": 0.011293836031612145,
"acc_norm": 0.2666232073011734,
"acc_norm_stderr": 0.011293836031612145
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.21691176470588236,
"acc_stderr": 0.02503584522771125,
"acc_norm": 0.21691176470588236,
"acc_norm_stderr": 0.02503584522771125
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2630718954248366,
"acc_stderr": 0.017812676542320657,
"acc_norm": 0.2630718954248366,
"acc_norm_stderr": 0.017812676542320657
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.3181818181818182,
"acc_stderr": 0.04461272175910509,
"acc_norm": 0.3181818181818182,
"acc_norm_stderr": 0.04461272175910509
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.2816326530612245,
"acc_stderr": 0.02879518557429129,
"acc_norm": 0.2816326530612245,
"acc_norm_stderr": 0.02879518557429129
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.2885572139303483,
"acc_stderr": 0.03203841040213321,
"acc_norm": 0.2885572139303483,
"acc_norm_stderr": 0.03203841040213321
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370519,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370519
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.22807017543859648,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.22807017543859648,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22399020807833536,
"mc1_stderr": 0.014594964329474207,
"mc2": 0.39020128384630975,
"mc2_stderr": 0.014969681174597876
},
"harness|winogrande|5": {
"acc": 0.5430149960536701,
"acc_stderr": 0.01400038676159829
},
"harness|gsm8k|5": {
"acc": 0.013646702047005308,
"acc_stderr": 0.0031957470754807923
}
}
```
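As an illustrative sketch of working with these figures, the per-task accuracies in the results dict can be aggregated with plain Python. The snippet below uses a small hand-copied subset of the values shown above (not the full results file) to compute a mean accuracy over the MMLU (`hendrycksTest`) tasks:

```python
# Hand-copied subset of the per-task results above (illustrative only,
# not the full results file).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.31},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.24444444444444444},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.3157894736842105},
}

# Average accuracy across the MMLU (hendrycksTest) tasks in the subset.
mmlu_accs = [v["acc"] for k, v in results.items()
             if k.startswith("harness|hendrycksTest")]
mean_acc = sum(mmlu_accs) / len(mmlu_accs)
print(round(mean_acc, 4))
```

The same pattern applies to the full JSON file linked above, where the MMLU average is taken over all 57 `hendrycksTest` subtasks.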
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Cosmos-AI/Cosmos-dataset | Cosmos-AI | "2024-03-11T13:48:51Z" | 0 | 0 | [
"language:en",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:37:52Z" | ---
language:
- en
pretty_name: Cosmos dataset v1
---
v1 |
DisgustingOzil/Academic_MCQ_Dataset | DisgustingOzil | "2024-03-10T11:46:38Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:38:28Z" | ---
dataset_info:
features:
- name: input
dtype: string
- name: instruction
dtype: string
- name: response
dtype: string
splits:
- name: train
num_bytes: 37358065
num_examples: 15812
download_size: 4463026
dataset_size: 37358065
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
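The metadata above declares three string columns — `input`, `instruction`, `response` — the usual shape of instruction-tuning data. A minimal, stdlib-only sketch of assembling one such record into a single training prompt (the field names come from the card's YAML; the sample row itself is hypothetical):

```python
# Assemble one instruction-tuning record (input/instruction/response,
# matching the schema declared in this card) into a single prompt string.
def to_prompt(record: dict) -> str:
    return (
        f"### Instruction:\n{record['instruction']}\n\n"
        f"### Input:\n{record['input']}\n\n"
        f"### Response:\n{record['response']}"
    )

# Hypothetical row; the real rows live in data/train-*.parquet.
sample = {
    "instruction": "Write an MCQ for the passage below.",
    "input": "Photosynthesis takes place in the chloroplast.",
    "response": "Q: Where does photosynthesis occur? A) Nucleus B) Chloroplast",
}
print(to_prompt(sample))
```

The exact prompt template is a common convention, not something this dataset prescribes; adapt the section headers to whatever format your fine-tuning pipeline expects.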
|
open-llm-leaderboard-old/details_frankenmerger__delta-4b-orange | open-llm-leaderboard-old | "2024-03-10T11:43:12Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T11:42:53Z" | ---
pretty_name: Evaluation run of frankenmerger/delta-4b-orange
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [frankenmerger/delta-4b-orange](https://huggingface.co/frankenmerger/delta-4b-orange)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_frankenmerger__delta-4b-orange\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T11:41:06.975310](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4b-orange/blob/main/results_2024-03-10T11-41-06.975310.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5672998500589107,\n\
\ \"acc_stderr\": 0.03374547957816994,\n \"acc_norm\": 0.568767763482637,\n\
\ \"acc_norm_stderr\": 0.0344408518430215,\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5681886396364959,\n\
\ \"mc2_stderr\": 0.015914705006773194\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.568259385665529,\n \"acc_stderr\": 0.014474591427196202,\n\
\ \"acc_norm\": 0.5887372013651877,\n \"acc_norm_stderr\": 0.014379441068522089\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5881298546106354,\n\
\ \"acc_stderr\": 0.004911659884506146,\n \"acc_norm\": 0.7658832901812388,\n\
\ \"acc_norm_stderr\": 0.004225800787050875\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.25,\n \"acc_stderr\": 0.04351941398892446,\n \
\ \"acc_norm\": 0.25,\n \"acc_norm_stderr\": 0.04351941398892446\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4888888888888889,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.4888888888888889,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5657894736842105,\n \"acc_stderr\": 0.040335656678483205,\n\
\ \"acc_norm\": 0.5657894736842105,\n \"acc_norm_stderr\": 0.040335656678483205\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.65,\n\
\ \"acc_stderr\": 0.047937248544110196,\n \"acc_norm\": 0.65,\n \
\ \"acc_norm_stderr\": 0.047937248544110196\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.030365050829115208,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.030365050829115208\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5833333333333334,\n\
\ \"acc_stderr\": 0.04122728707651282,\n \"acc_norm\": 0.5833333333333334,\n\
\ \"acc_norm_stderr\": 0.04122728707651282\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.04760952285695236,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.04760952285695236\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.49,\n \"acc_stderr\": 0.05024183937956913,\n \"acc_norm\": 0.49,\n\
\ \"acc_norm_stderr\": 0.05024183937956913\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001974,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001974\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5549132947976878,\n\
\ \"acc_stderr\": 0.03789401760283647,\n \"acc_norm\": 0.5549132947976878,\n\
\ \"acc_norm_stderr\": 0.03789401760283647\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929775,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929775\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.7,\n\
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4808510638297872,\n \"acc_stderr\": 0.03266204299064678,\n\
\ \"acc_norm\": 0.4808510638297872,\n \"acc_norm_stderr\": 0.03266204299064678\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.3333333333333333,\n\
\ \"acc_stderr\": 0.044346007015849245,\n \"acc_norm\": 0.3333333333333333,\n\
\ \"acc_norm_stderr\": 0.044346007015849245\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.41379310344827586,\n \"acc_stderr\": 0.04104269211806232,\n\
\ \"acc_norm\": 0.41379310344827586,\n \"acc_norm_stderr\": 0.04104269211806232\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.4603174603174603,\n\
\ \"acc_stderr\": 0.04458029125470973,\n \"acc_norm\": 0.4603174603174603,\n\
\ \"acc_norm_stderr\": 0.04458029125470973\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.6580645161290323,\n \"acc_stderr\": 0.02698528957655274,\n \"\
acc_norm\": 0.6580645161290323,\n \"acc_norm_stderr\": 0.02698528957655274\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.43842364532019706,\n \"acc_stderr\": 0.03491207857486518,\n \"\
acc_norm\": 0.43842364532019706,\n \"acc_norm_stderr\": 0.03491207857486518\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.66,\n \"acc_stderr\": 0.04760952285695237,\n \"acc_norm\"\
: 0.66,\n \"acc_norm_stderr\": 0.04760952285695237\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7151515151515152,\n \"acc_stderr\": 0.03524390844511781,\n\
\ \"acc_norm\": 0.7151515151515152,\n \"acc_norm_stderr\": 0.03524390844511781\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7272727272727273,\n \"acc_stderr\": 0.03173071239071724,\n \"\
acc_norm\": 0.7272727272727273,\n \"acc_norm_stderr\": 0.03173071239071724\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8238341968911918,\n \"acc_stderr\": 0.027493504244548057,\n\
\ \"acc_norm\": 0.8238341968911918,\n \"acc_norm_stderr\": 0.027493504244548057\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6025641025641025,\n \"acc_stderr\": 0.024811920017903836,\n\
\ \"acc_norm\": 0.6025641025641025,\n \"acc_norm_stderr\": 0.024811920017903836\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.2777777777777778,\n \"acc_stderr\": 0.02730914058823017,\n \
\ \"acc_norm\": 0.2777777777777778,\n \"acc_norm_stderr\": 0.02730914058823017\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6428571428571429,\n \"acc_stderr\": 0.031124619309328177,\n\
\ \"acc_norm\": 0.6428571428571429,\n \"acc_norm_stderr\": 0.031124619309328177\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3443708609271523,\n \"acc_stderr\": 0.038796870240733264,\n \"\
acc_norm\": 0.3443708609271523,\n \"acc_norm_stderr\": 0.038796870240733264\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7908256880733945,\n \"acc_stderr\": 0.017437937173343236,\n \"\
acc_norm\": 0.7908256880733945,\n \"acc_norm_stderr\": 0.017437937173343236\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538272,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7254901960784313,\n \"acc_stderr\": 0.0313217980308329,\n \"acc_norm\"\
: 0.7254901960784313,\n \"acc_norm_stderr\": 0.0313217980308329\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.7426160337552743,\n \"acc_stderr\": 0.0284588209914603,\n \"acc_norm\"\
: 0.7426160337552743,\n \"acc_norm_stderr\": 0.0284588209914603\n },\n\
\ \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6457399103139013,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.6457399103139013,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6793893129770993,\n \"acc_stderr\": 0.04093329229834278,\n\
\ \"acc_norm\": 0.6793893129770993,\n \"acc_norm_stderr\": 0.04093329229834278\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.03984979653302871,\n \"acc_norm\"\
: 0.743801652892562,\n \"acc_norm_stderr\": 0.03984979653302871\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.043300437496507416,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.043300437496507416\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7300613496932515,\n \"acc_stderr\": 0.034878251684978906,\n\
\ \"acc_norm\": 0.7300613496932515,\n \"acc_norm_stderr\": 0.034878251684978906\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.39285714285714285,\n\
\ \"acc_stderr\": 0.04635550135609976,\n \"acc_norm\": 0.39285714285714285,\n\
\ \"acc_norm_stderr\": 0.04635550135609976\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.782051282051282,\n\
\ \"acc_stderr\": 0.027046857630716684,\n \"acc_norm\": 0.782051282051282,\n\
\ \"acc_norm_stderr\": 0.027046857630716684\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.64,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.64,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.6896551724137931,\n\
\ \"acc_stderr\": 0.016543785026048308,\n \"acc_norm\": 0.6896551724137931,\n\
\ \"acc_norm_stderr\": 0.016543785026048308\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6589595375722543,\n \"acc_stderr\": 0.025522474632121615,\n\
\ \"acc_norm\": 0.6589595375722543,\n \"acc_norm_stderr\": 0.025522474632121615\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.293854748603352,\n\
\ \"acc_stderr\": 0.01523507577671961,\n \"acc_norm\": 0.293854748603352,\n\
\ \"acc_norm_stderr\": 0.01523507577671961\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5882352941176471,\n \"acc_stderr\": 0.028180596328259287,\n\
\ \"acc_norm\": 0.5882352941176471,\n \"acc_norm_stderr\": 0.028180596328259287\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6302250803858521,\n\
\ \"acc_stderr\": 0.027417996705630998,\n \"acc_norm\": 0.6302250803858521,\n\
\ \"acc_norm_stderr\": 0.027417996705630998\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6265432098765432,\n \"acc_stderr\": 0.02691500301138016,\n\
\ \"acc_norm\": 0.6265432098765432,\n \"acc_norm_stderr\": 0.02691500301138016\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4326241134751773,\n \"acc_stderr\": 0.029555454236778855,\n \
\ \"acc_norm\": 0.4326241134751773,\n \"acc_norm_stderr\": 0.029555454236778855\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4256844850065189,\n\
\ \"acc_stderr\": 0.012628393551811945,\n \"acc_norm\": 0.4256844850065189,\n\
\ \"acc_norm_stderr\": 0.012628393551811945\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5110294117647058,\n \"acc_stderr\": 0.030365446477275668,\n\
\ \"acc_norm\": 0.5110294117647058,\n \"acc_norm_stderr\": 0.030365446477275668\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.576797385620915,\n \"acc_stderr\": 0.01998780976948206,\n \
\ \"acc_norm\": 0.576797385620915,\n \"acc_norm_stderr\": 0.01998780976948206\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6181818181818182,\n\
\ \"acc_stderr\": 0.046534298079135075,\n \"acc_norm\": 0.6181818181818182,\n\
\ \"acc_norm_stderr\": 0.046534298079135075\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.5836734693877551,\n \"acc_stderr\": 0.031557828165561644,\n\
\ \"acc_norm\": 0.5836734693877551,\n \"acc_norm_stderr\": 0.031557828165561644\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7711442786069652,\n\
\ \"acc_stderr\": 0.02970528405677244,\n \"acc_norm\": 0.7711442786069652,\n\
\ \"acc_norm_stderr\": 0.02970528405677244\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.04560480215720684,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.04560480215720684\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4457831325301205,\n\
\ \"acc_stderr\": 0.03869543323472101,\n \"acc_norm\": 0.4457831325301205,\n\
\ \"acc_norm_stderr\": 0.03869543323472101\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.6608187134502924,\n \"acc_stderr\": 0.03631053496488904,\n\
\ \"acc_norm\": 0.6608187134502924,\n \"acc_norm_stderr\": 0.03631053496488904\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.401468788249694,\n\
\ \"mc1_stderr\": 0.017160273901693654,\n \"mc2\": 0.5681886396364959,\n\
\ \"mc2_stderr\": 0.015914705006773194\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7647987371744278,\n \"acc_stderr\": 0.011920008163650868\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.48142532221379836,\n \
\ \"acc_stderr\": 0.013762977910317584\n }\n}\n```"
repo_url: https://huggingface.co/frankenmerger/delta-4b-orange
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-41-06.975310.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T11-41-06.975310.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- '**/details_harness|winogrande|5_2024-03-10T11-41-06.975310.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T11-41-06.975310.parquet'
- config_name: results
data_files:
- split: 2024_03_10T11_41_06.975310
path:
- results_2024-03-10T11-41-06.975310.parquet
- split: latest
path:
- results_2024-03-10T11-41-06.975310.parquet
---
# Dataset Card for Evaluation run of frankenmerger/delta-4b-orange
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [frankenmerger/delta-4b-orange](https://huggingface.co/frankenmerger/delta-4b-orange) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_frankenmerger__delta-4b-orange",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T11:41:06.975310](https://huggingface.co/datasets/open-llm-leaderboard/details_frankenmerger__delta-4b-orange/blob/main/results_2024-03-10T11-41-06.975310.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5672998500589107,
"acc_stderr": 0.03374547957816994,
"acc_norm": 0.568767763482637,
"acc_norm_stderr": 0.0344408518430215,
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5681886396364959,
"mc2_stderr": 0.015914705006773194
},
"harness|arc:challenge|25": {
"acc": 0.568259385665529,
"acc_stderr": 0.014474591427196202,
"acc_norm": 0.5887372013651877,
"acc_norm_stderr": 0.014379441068522089
},
"harness|hellaswag|10": {
"acc": 0.5881298546106354,
"acc_stderr": 0.004911659884506146,
"acc_norm": 0.7658832901812388,
"acc_norm_stderr": 0.004225800787050875
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.25,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.25,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4888888888888889,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.4888888888888889,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5657894736842105,
"acc_stderr": 0.040335656678483205,
"acc_norm": 0.5657894736842105,
"acc_norm_stderr": 0.040335656678483205
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.65,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.65,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.030365050829115208,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.030365050829115208
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5833333333333334,
"acc_stderr": 0.04122728707651282,
"acc_norm": 0.5833333333333334,
"acc_norm_stderr": 0.04122728707651282
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.34,
"acc_stderr": 0.04760952285695236,
"acc_norm": 0.34,
"acc_norm_stderr": 0.04760952285695236
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.49,
"acc_stderr": 0.05024183937956913,
"acc_norm": 0.49,
"acc_norm_stderr": 0.05024183937956913
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5549132947976878,
"acc_stderr": 0.03789401760283647,
"acc_norm": 0.5549132947976878,
"acc_norm_stderr": 0.03789401760283647
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929775,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929775
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4808510638297872,
"acc_stderr": 0.03266204299064678,
"acc_norm": 0.4808510638297872,
"acc_norm_stderr": 0.03266204299064678
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.044346007015849245,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.044346007015849245
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.41379310344827586,
"acc_stderr": 0.04104269211806232,
"acc_norm": 0.41379310344827586,
"acc_norm_stderr": 0.04104269211806232
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.4603174603174603,
"acc_stderr": 0.04458029125470973,
"acc_norm": 0.4603174603174603,
"acc_norm_stderr": 0.04458029125470973
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.02698528957655274,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.02698528957655274
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.43842364532019706,
"acc_stderr": 0.03491207857486518,
"acc_norm": 0.43842364532019706,
"acc_norm_stderr": 0.03491207857486518
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.66,
"acc_stderr": 0.04760952285695237,
"acc_norm": 0.66,
"acc_norm_stderr": 0.04760952285695237
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7151515151515152,
"acc_stderr": 0.03524390844511781,
"acc_norm": 0.7151515151515152,
"acc_norm_stderr": 0.03524390844511781
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7272727272727273,
"acc_stderr": 0.03173071239071724,
"acc_norm": 0.7272727272727273,
"acc_norm_stderr": 0.03173071239071724
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8238341968911918,
"acc_stderr": 0.027493504244548057,
"acc_norm": 0.8238341968911918,
"acc_norm_stderr": 0.027493504244548057
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6025641025641025,
"acc_stderr": 0.024811920017903836,
"acc_norm": 0.6025641025641025,
"acc_norm_stderr": 0.024811920017903836
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.2777777777777778,
"acc_stderr": 0.02730914058823017,
"acc_norm": 0.2777777777777778,
"acc_norm_stderr": 0.02730914058823017
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6428571428571429,
"acc_stderr": 0.031124619309328177,
"acc_norm": 0.6428571428571429,
"acc_norm_stderr": 0.031124619309328177
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3443708609271523,
"acc_stderr": 0.038796870240733264,
"acc_norm": 0.3443708609271523,
"acc_norm_stderr": 0.038796870240733264
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7908256880733945,
"acc_stderr": 0.017437937173343236,
"acc_norm": 0.7908256880733945,
"acc_norm_stderr": 0.017437937173343236
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.0313217980308329,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.0313217980308329
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7426160337552743,
"acc_stderr": 0.0284588209914603,
"acc_norm": 0.7426160337552743,
"acc_norm_stderr": 0.0284588209914603
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6457399103139013,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.6457399103139013,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6793893129770993,
"acc_stderr": 0.04093329229834278,
"acc_norm": 0.6793893129770993,
"acc_norm_stderr": 0.04093329229834278
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.03984979653302871,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.03984979653302871
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.043300437496507416,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.043300437496507416
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7300613496932515,
"acc_stderr": 0.034878251684978906,
"acc_norm": 0.7300613496932515,
"acc_norm_stderr": 0.034878251684978906
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.39285714285714285,
"acc_stderr": 0.04635550135609976,
"acc_norm": 0.39285714285714285,
"acc_norm_stderr": 0.04635550135609976
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.782051282051282,
"acc_stderr": 0.027046857630716684,
"acc_norm": 0.782051282051282,
"acc_norm_stderr": 0.027046857630716684
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.6896551724137931,
"acc_stderr": 0.016543785026048308,
"acc_norm": 0.6896551724137931,
"acc_norm_stderr": 0.016543785026048308
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6589595375722543,
"acc_stderr": 0.025522474632121615,
"acc_norm": 0.6589595375722543,
"acc_norm_stderr": 0.025522474632121615
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.293854748603352,
"acc_stderr": 0.01523507577671961,
"acc_norm": 0.293854748603352,
"acc_norm_stderr": 0.01523507577671961
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5882352941176471,
"acc_stderr": 0.028180596328259287,
"acc_norm": 0.5882352941176471,
"acc_norm_stderr": 0.028180596328259287
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6302250803858521,
"acc_stderr": 0.027417996705630998,
"acc_norm": 0.6302250803858521,
"acc_norm_stderr": 0.027417996705630998
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6265432098765432,
"acc_stderr": 0.02691500301138016,
"acc_norm": 0.6265432098765432,
"acc_norm_stderr": 0.02691500301138016
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4326241134751773,
"acc_stderr": 0.029555454236778855,
"acc_norm": 0.4326241134751773,
"acc_norm_stderr": 0.029555454236778855
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4256844850065189,
"acc_stderr": 0.012628393551811945,
"acc_norm": 0.4256844850065189,
"acc_norm_stderr": 0.012628393551811945
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5110294117647058,
"acc_stderr": 0.030365446477275668,
"acc_norm": 0.5110294117647058,
"acc_norm_stderr": 0.030365446477275668
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.576797385620915,
"acc_stderr": 0.01998780976948206,
"acc_norm": 0.576797385620915,
"acc_norm_stderr": 0.01998780976948206
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6181818181818182,
"acc_stderr": 0.046534298079135075,
"acc_norm": 0.6181818181818182,
"acc_norm_stderr": 0.046534298079135075
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.5836734693877551,
"acc_stderr": 0.031557828165561644,
"acc_norm": 0.5836734693877551,
"acc_norm_stderr": 0.031557828165561644
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7711442786069652,
"acc_stderr": 0.02970528405677244,
"acc_norm": 0.7711442786069652,
"acc_norm_stderr": 0.02970528405677244
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.71,
"acc_stderr": 0.04560480215720684,
"acc_norm": 0.71,
"acc_norm_stderr": 0.04560480215720684
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4457831325301205,
"acc_stderr": 0.03869543323472101,
"acc_norm": 0.4457831325301205,
"acc_norm_stderr": 0.03869543323472101
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.6608187134502924,
"acc_stderr": 0.03631053496488904,
"acc_norm": 0.6608187134502924,
"acc_norm_stderr": 0.03631053496488904
},
"harness|truthfulqa:mc|0": {
"mc1": 0.401468788249694,
"mc1_stderr": 0.017160273901693654,
"mc2": 0.5681886396364959,
"mc2_stderr": 0.015914705006773194
},
"harness|winogrande|5": {
"acc": 0.7647987371744278,
"acc_stderr": 0.011920008163650868
},
"harness|gsm8k|5": {
"acc": 0.48142532221379836,
"acc_stderr": 0.013762977910317584
}
}
```
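Once the results JSON above has been parsed into a Python dict, the per-task scores are easy to post-process. The following is a minimal sketch (not part of the official tooling) that averages the `hendrycksTest` (MMLU) subtask accuracies; the two entries in the `results` dict are copied from the JSON above purely for illustration:

```python
# Minimal sketch: average "acc" over the hendrycksTest (MMLU) subtasks.
# Assumes `results` mirrors the structure of the results JSON shown above;
# only two subtasks are included here as an illustration.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.25},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.4888888888888889},
}

mmlu_scores = [
    v["acc"]
    for k, v in results.items()
    if k.startswith("harness|hendrycksTest-")
]
mmlu_avg = sum(mmlu_scores) / len(mmlu_scores)
print(round(mmlu_avg, 4))  # 0.3694 for these two subtasks
```

The same key-prefix filtering works for any of the `harness|…` task families in the JSON.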
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Lonewolf2441139/gcdata | Lonewolf2441139 | "2024-03-10T11:53:56Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:45:37Z" | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 1673415
num_examples: 967
download_size: 575440
dataset_size: 1673415
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Schibsted/vg-front-title | Schibsted | "2024-06-06T13:53:31Z" | 0 | 1 | [
"license:cc-by-nc-nd-4.0",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:47:57Z" | ---
license: cc-by-nc-nd-4.0
---
# VG front titles
Published front titles from Schibsted Media's Verdens Gang (VG) newsroom. |
Yonchanok/pro_small_Test_cat_cii | Yonchanok | "2024-04-14T04:36:39Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T11:55:32Z" | ---
dataset_info:
features:
- name: Q
dtype: string
- name: A
dtype: string
- name: Short_A
dtype: string
- name: I
dtype: string
- name: Type
dtype: string
- name: instruction
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 103906616
num_examples: 45232
download_size: 4784719
dataset_size: 103906616
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
yasirfaizahmed/good_tweet_bat_tweet | yasirfaizahmed | "2024-03-10T12:24:20Z" | 0 | 0 | [
"task_categories:text-classification",
"language:en",
"license:apache-2.0",
"size_categories:1K<n<10K",
"region:us",
"not-for-all-audiences"
] | [
"text-classification"
] | "2024-03-10T12:08:14Z" | ---
license: apache-2.0
task_categories:
- text-classification
language:
- en
tags:
- not-for-all-audiences
pretty_name: good_tweet_bad_tweet
size_categories:
- 1K<n<10K
--- |
pmarmik/filtered_samvaad | pmarmik | "2024-03-10T12:30:19Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T12:13:05Z" | ---
dataset_info:
features:
- name: messages
dtype: string
splits:
- name: train
num_bytes: 325833992.9441444
num_examples: 68000
- name: validation
num_bytes: 45520925.48484371
num_examples: 9500
- name: test
num_bytes: 23958381.834128268
num_examples: 5000
download_size: 166833215
dataset_size: 395313300.2631164
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard-old/details_feeltheAGI__mistral-maths7B | open-llm-leaderboard-old | "2024-03-10T12:13:50Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T12:13:31Z" | ---
pretty_name: Evaluation run of feeltheAGI/mistral-maths7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [feeltheAGI/mistral-maths7B](https://huggingface.co/feeltheAGI/mistral-maths7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_feeltheAGI__mistral-maths7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T12:11:08.454866](https://huggingface.co/datasets/open-llm-leaderboard/details_feeltheAGI__mistral-maths7B/blob/main/results_2024-03-10T12-11-08.454866.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5469269437334953,\n\
\ \"acc_stderr\": 0.03410012829437269,\n \"acc_norm\": 0.5483740722847594,\n\
\ \"acc_norm_stderr\": 0.03479918880134391,\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093904,\n \"mc2\": 0.5730029044942812,\n\
\ \"mc2_stderr\": 0.01530435216141352\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5042662116040956,\n \"acc_stderr\": 0.014610858923956948,\n\
\ \"acc_norm\": 0.5204778156996587,\n \"acc_norm_stderr\": 0.014599131353035009\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5595498904600678,\n\
\ \"acc_stderr\": 0.004954265595373459,\n \"acc_norm\": 0.7476598287193786,\n\
\ \"acc_norm_stderr\": 0.004334676952703863\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.4148148148148148,\n\
\ \"acc_stderr\": 0.04256193767901407,\n \"acc_norm\": 0.4148148148148148,\n\
\ \"acc_norm_stderr\": 0.04256193767901407\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5526315789473685,\n \"acc_stderr\": 0.040463368839782514,\n\
\ \"acc_norm\": 0.5526315789473685,\n \"acc_norm_stderr\": 0.040463368839782514\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.03036505082911521,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.03036505082911521\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5902777777777778,\n\
\ \"acc_stderr\": 0.04112490974670787,\n \"acc_norm\": 0.5902777777777778,\n\
\ \"acc_norm_stderr\": 0.04112490974670787\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.38,\n \"acc_stderr\": 0.04878317312145632,\n \
\ \"acc_norm\": 0.38,\n \"acc_norm_stderr\": 0.04878317312145632\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.52,\n \"acc_stderr\": 0.050211673156867795,\n \"acc_norm\": 0.52,\n\
\ \"acc_norm_stderr\": 0.050211673156867795\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.04793724854411019,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.04793724854411019\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5144508670520231,\n\
\ \"acc_stderr\": 0.03810871630454764,\n \"acc_norm\": 0.5144508670520231,\n\
\ \"acc_norm_stderr\": 0.03810871630454764\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.3235294117647059,\n \"acc_stderr\": 0.04655010411319619,\n\
\ \"acc_norm\": 0.3235294117647059,\n \"acc_norm_stderr\": 0.04655010411319619\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \"acc_norm\": 0.73,\n\
\ \"acc_norm_stderr\": 0.044619604333847394\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.44680851063829785,\n \"acc_stderr\": 0.0325005368436584,\n\
\ \"acc_norm\": 0.44680851063829785,\n \"acc_norm_stderr\": 0.0325005368436584\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.34210526315789475,\n\
\ \"acc_stderr\": 0.04462917535336936,\n \"acc_norm\": 0.34210526315789475,\n\
\ \"acc_norm_stderr\": 0.04462917535336936\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.503448275862069,\n \"acc_stderr\": 0.04166567577101579,\n\
\ \"acc_norm\": 0.503448275862069,\n \"acc_norm_stderr\": 0.04166567577101579\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3333333333333333,\n \"acc_stderr\": 0.024278568024307702,\n \"\
acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.024278568024307702\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.3968253968253968,\n\
\ \"acc_stderr\": 0.04375888492727061,\n \"acc_norm\": 0.3968253968253968,\n\
\ \"acc_norm_stderr\": 0.04375888492727061\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6419354838709678,\n\
\ \"acc_stderr\": 0.027273890594300642,\n \"acc_norm\": 0.6419354838709678,\n\
\ \"acc_norm_stderr\": 0.027273890594300642\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4039408866995074,\n \"acc_stderr\": 0.0345245390382204,\n\
\ \"acc_norm\": 0.4039408866995074,\n \"acc_norm_stderr\": 0.0345245390382204\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.51,\n \"acc_stderr\": 0.05024183937956912,\n \"acc_norm\"\
: 0.51,\n \"acc_norm_stderr\": 0.05024183937956912\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6848484848484848,\n \"acc_stderr\": 0.0362773057502241,\n\
\ \"acc_norm\": 0.6848484848484848,\n \"acc_norm_stderr\": 0.0362773057502241\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7222222222222222,\n \"acc_stderr\": 0.03191178226713548,\n \"\
acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.03191178226713548\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7357512953367875,\n \"acc_stderr\": 0.03182155050916646,\n\
\ \"acc_norm\": 0.7357512953367875,\n \"acc_norm_stderr\": 0.03182155050916646\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5,\n \"acc_stderr\": 0.02535100632816969,\n \"acc_norm\"\
: 0.5,\n \"acc_norm_stderr\": 0.02535100632816969\n },\n \"harness|hendrycksTest-high_school_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.027940457136228412,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.027940457136228412\n \
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \"\
acc\": 0.5252100840336135,\n \"acc_stderr\": 0.03243718055137411,\n \
\ \"acc_norm\": 0.5252100840336135,\n \"acc_norm_stderr\": 0.03243718055137411\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7394495412844037,\n \"acc_stderr\": 0.01881918203485007,\n \"\
acc_norm\": 0.7394495412844037,\n \"acc_norm_stderr\": 0.01881918203485007\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.44907407407407407,\n \"acc_stderr\": 0.03392238405321616,\n \"\
acc_norm\": 0.44907407407407407,\n \"acc_norm_stderr\": 0.03392238405321616\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.6813725490196079,\n \"acc_stderr\": 0.0327028718148208,\n \"acc_norm\"\
: 0.6813725490196079,\n \"acc_norm_stderr\": 0.0327028718148208\n },\n\
\ \"harness|hendrycksTest-high_school_world_history|5\": {\n \"acc\":\
\ 0.6751054852320675,\n \"acc_stderr\": 0.030486039389105293,\n \"\
acc_norm\": 0.6751054852320675,\n \"acc_norm_stderr\": 0.030486039389105293\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6188340807174888,\n\
\ \"acc_stderr\": 0.03259625118416827,\n \"acc_norm\": 0.6188340807174888,\n\
\ \"acc_norm_stderr\": 0.03259625118416827\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.6611570247933884,\n \"acc_stderr\": 0.04320767807536671,\n \"\
acc_norm\": 0.6611570247933884,\n \"acc_norm_stderr\": 0.04320767807536671\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.6574074074074074,\n\
\ \"acc_stderr\": 0.045879047413018105,\n \"acc_norm\": 0.6574074074074074,\n\
\ \"acc_norm_stderr\": 0.045879047413018105\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.6441717791411042,\n \"acc_stderr\": 0.03761521380046734,\n\
\ \"acc_norm\": 0.6441717791411042,\n \"acc_norm_stderr\": 0.03761521380046734\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7087378640776699,\n \"acc_stderr\": 0.044986763205729224,\n\
\ \"acc_norm\": 0.7087378640776699,\n \"acc_norm_stderr\": 0.044986763205729224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8418803418803419,\n\
\ \"acc_stderr\": 0.023902325549560396,\n \"acc_norm\": 0.8418803418803419,\n\
\ \"acc_norm_stderr\": 0.023902325549560396\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.65,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.65,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7547892720306514,\n\
\ \"acc_stderr\": 0.01538435228454393,\n \"acc_norm\": 0.7547892720306514,\n\
\ \"acc_norm_stderr\": 0.01538435228454393\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.5722543352601156,\n \"acc_stderr\": 0.026636539741116082,\n\
\ \"acc_norm\": 0.5722543352601156,\n \"acc_norm_stderr\": 0.026636539741116082\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.22905027932960895,\n\
\ \"acc_stderr\": 0.014054314935614569,\n \"acc_norm\": 0.22905027932960895,\n\
\ \"acc_norm_stderr\": 0.014054314935614569\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.5816993464052288,\n \"acc_stderr\": 0.028245134024387292,\n\
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.028245134024387292\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6270096463022508,\n\
\ \"acc_stderr\": 0.0274666102131401,\n \"acc_norm\": 0.6270096463022508,\n\
\ \"acc_norm_stderr\": 0.0274666102131401\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.5925925925925926,\n \"acc_stderr\": 0.027339546640662734,\n\
\ \"acc_norm\": 0.5925925925925926,\n \"acc_norm_stderr\": 0.027339546640662734\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.38652482269503546,\n \"acc_stderr\": 0.029049190342543465,\n \
\ \"acc_norm\": 0.38652482269503546,\n \"acc_norm_stderr\": 0.029049190342543465\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.3813559322033898,\n\
\ \"acc_stderr\": 0.012405509401888122,\n \"acc_norm\": 0.3813559322033898,\n\
\ \"acc_norm_stderr\": 0.012405509401888122\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5441176470588235,\n \"acc_stderr\": 0.030254372573976715,\n\
\ \"acc_norm\": 0.5441176470588235,\n \"acc_norm_stderr\": 0.030254372573976715\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5277777777777778,\n \"acc_stderr\": 0.020196594933541194,\n \
\ \"acc_norm\": 0.5277777777777778,\n \"acc_norm_stderr\": 0.020196594933541194\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6636363636363637,\n\
\ \"acc_stderr\": 0.04525393596302505,\n \"acc_norm\": 0.6636363636363637,\n\
\ \"acc_norm_stderr\": 0.04525393596302505\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6775510204081633,\n \"acc_stderr\": 0.02992310056368391,\n\
\ \"acc_norm\": 0.6775510204081633,\n \"acc_norm_stderr\": 0.02992310056368391\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7014925373134329,\n\
\ \"acc_stderr\": 0.03235743789355043,\n \"acc_norm\": 0.7014925373134329,\n\
\ \"acc_norm_stderr\": 0.03235743789355043\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.74,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.74,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.42168674698795183,\n\
\ \"acc_stderr\": 0.03844453181770917,\n \"acc_norm\": 0.42168674698795183,\n\
\ \"acc_norm_stderr\": 0.03844453181770917\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7017543859649122,\n \"acc_stderr\": 0.03508771929824564,\n\
\ \"acc_norm\": 0.7017543859649122,\n \"acc_norm_stderr\": 0.03508771929824564\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.40636474908200737,\n\
\ \"mc1_stderr\": 0.017193835812093904,\n \"mc2\": 0.5730029044942812,\n\
\ \"mc2_stderr\": 0.01530435216141352\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7245461720599842,\n \"acc_stderr\": 0.01255569005570953\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.488248673237301,\n \
\ \"acc_stderr\": 0.013768680408142796\n }\n}\n```"
repo_url: https://huggingface.co/feeltheAGI/mistral-maths7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|arc:challenge|25_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|gsm8k|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hellaswag|10_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-11-08.454866.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T12-11-08.454866.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- '**/details_harness|winogrande|5_2024-03-10T12-11-08.454866.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T12-11-08.454866.parquet'
- config_name: results
data_files:
- split: 2024_03_10T12_11_08.454866
path:
- results_2024-03-10T12-11-08.454866.parquet
- split: latest
path:
- results_2024-03-10T12-11-08.454866.parquet
---
# Dataset Card for Evaluation run of feeltheAGI/mistral-maths7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [feeltheAGI/mistral-maths7B](https://huggingface.co/feeltheAGI/mistral-maths7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_feeltheAGI__mistral-maths7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T12:11:08.454866](https://huggingface.co/datasets/open-llm-leaderboard/details_feeltheAGI__mistral-maths7B/blob/main/results_2024-03-10T12-11-08.454866.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5469269437334953,
"acc_stderr": 0.03410012829437269,
"acc_norm": 0.5483740722847594,
"acc_norm_stderr": 0.03479918880134391,
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093904,
"mc2": 0.5730029044942812,
"mc2_stderr": 0.01530435216141352
},
"harness|arc:challenge|25": {
"acc": 0.5042662116040956,
"acc_stderr": 0.014610858923956948,
"acc_norm": 0.5204778156996587,
"acc_norm_stderr": 0.014599131353035009
},
"harness|hellaswag|10": {
"acc": 0.5595498904600678,
"acc_stderr": 0.004954265595373459,
"acc_norm": 0.7476598287193786,
"acc_norm_stderr": 0.004334676952703863
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.4148148148148148,
"acc_stderr": 0.04256193767901407,
"acc_norm": 0.4148148148148148,
"acc_norm_stderr": 0.04256193767901407
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5526315789473685,
"acc_stderr": 0.040463368839782514,
"acc_norm": 0.5526315789473685,
"acc_norm_stderr": 0.040463368839782514
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5902777777777778,
"acc_stderr": 0.04112490974670787,
"acc_norm": 0.5902777777777778,
"acc_norm_stderr": 0.04112490974670787
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.38,
"acc_stderr": 0.04878317312145632,
"acc_norm": 0.38,
"acc_norm_stderr": 0.04878317312145632
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.52,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.52,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.35,
"acc_stderr": 0.04793724854411019,
"acc_norm": 0.35,
"acc_norm_stderr": 0.04793724854411019
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5144508670520231,
"acc_stderr": 0.03810871630454764,
"acc_norm": 0.5144508670520231,
"acc_norm_stderr": 0.03810871630454764
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.3235294117647059,
"acc_stderr": 0.04655010411319619,
"acc_norm": 0.3235294117647059,
"acc_norm_stderr": 0.04655010411319619
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.44680851063829785,
"acc_stderr": 0.0325005368436584,
"acc_norm": 0.44680851063829785,
"acc_norm_stderr": 0.0325005368436584
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.34210526315789475,
"acc_stderr": 0.04462917535336936,
"acc_norm": 0.34210526315789475,
"acc_norm_stderr": 0.04462917535336936
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.503448275862069,
"acc_stderr": 0.04166567577101579,
"acc_norm": 0.503448275862069,
"acc_norm_stderr": 0.04166567577101579
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.024278568024307702,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.024278568024307702
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.04375888492727061,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.04375888492727061
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6419354838709678,
"acc_stderr": 0.027273890594300642,
"acc_norm": 0.6419354838709678,
"acc_norm_stderr": 0.027273890594300642
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4039408866995074,
"acc_stderr": 0.0345245390382204,
"acc_norm": 0.4039408866995074,
"acc_norm_stderr": 0.0345245390382204
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.51,
"acc_stderr": 0.05024183937956912,
"acc_norm": 0.51,
"acc_norm_stderr": 0.05024183937956912
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6848484848484848,
"acc_stderr": 0.0362773057502241,
"acc_norm": 0.6848484848484848,
"acc_norm_stderr": 0.0362773057502241
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.03191178226713548,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.03191178226713548
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7357512953367875,
"acc_stderr": 0.03182155050916646,
"acc_norm": 0.7357512953367875,
"acc_norm_stderr": 0.03182155050916646
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5,
"acc_stderr": 0.02535100632816969,
"acc_norm": 0.5,
"acc_norm_stderr": 0.02535100632816969
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.027940457136228412,
"acc_norm": 0.3,
"acc_norm_stderr": 0.027940457136228412
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.5252100840336135,
"acc_stderr": 0.03243718055137411,
"acc_norm": 0.5252100840336135,
"acc_norm_stderr": 0.03243718055137411
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7394495412844037,
"acc_stderr": 0.01881918203485007,
"acc_norm": 0.7394495412844037,
"acc_norm_stderr": 0.01881918203485007
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.44907407407407407,
"acc_stderr": 0.03392238405321616,
"acc_norm": 0.44907407407407407,
"acc_norm_stderr": 0.03392238405321616
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.6813725490196079,
"acc_stderr": 0.0327028718148208,
"acc_norm": 0.6813725490196079,
"acc_norm_stderr": 0.0327028718148208
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.6751054852320675,
"acc_stderr": 0.030486039389105293,
"acc_norm": 0.6751054852320675,
"acc_norm_stderr": 0.030486039389105293
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6188340807174888,
"acc_stderr": 0.03259625118416827,
"acc_norm": 0.6188340807174888,
"acc_norm_stderr": 0.03259625118416827
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.6611570247933884,
"acc_stderr": 0.04320767807536671,
"acc_norm": 0.6611570247933884,
"acc_norm_stderr": 0.04320767807536671
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.6574074074074074,
"acc_stderr": 0.045879047413018105,
"acc_norm": 0.6574074074074074,
"acc_norm_stderr": 0.045879047413018105
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.6441717791411042,
"acc_stderr": 0.03761521380046734,
"acc_norm": 0.6441717791411042,
"acc_norm_stderr": 0.03761521380046734
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7087378640776699,
"acc_stderr": 0.044986763205729224,
"acc_norm": 0.7087378640776699,
"acc_norm_stderr": 0.044986763205729224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8418803418803419,
"acc_stderr": 0.023902325549560396,
"acc_norm": 0.8418803418803419,
"acc_norm_stderr": 0.023902325549560396
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.65,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.65,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7547892720306514,
"acc_stderr": 0.01538435228454393,
"acc_norm": 0.7547892720306514,
"acc_norm_stderr": 0.01538435228454393
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.5722543352601156,
"acc_stderr": 0.026636539741116082,
"acc_norm": 0.5722543352601156,
"acc_norm_stderr": 0.026636539741116082
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.22905027932960895,
"acc_stderr": 0.014054314935614569,
"acc_norm": 0.22905027932960895,
"acc_norm_stderr": 0.014054314935614569
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.028245134024387292,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.028245134024387292
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6270096463022508,
"acc_stderr": 0.0274666102131401,
"acc_norm": 0.6270096463022508,
"acc_norm_stderr": 0.0274666102131401
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.027339546640662734,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.027339546640662734
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.38652482269503546,
"acc_stderr": 0.029049190342543465,
"acc_norm": 0.38652482269503546,
"acc_norm_stderr": 0.029049190342543465
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.3813559322033898,
"acc_stderr": 0.012405509401888122,
"acc_norm": 0.3813559322033898,
"acc_norm_stderr": 0.012405509401888122
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5441176470588235,
"acc_stderr": 0.030254372573976715,
"acc_norm": 0.5441176470588235,
"acc_norm_stderr": 0.030254372573976715
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5277777777777778,
"acc_stderr": 0.020196594933541194,
"acc_norm": 0.5277777777777778,
"acc_norm_stderr": 0.020196594933541194
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6636363636363637,
"acc_stderr": 0.04525393596302505,
"acc_norm": 0.6636363636363637,
"acc_norm_stderr": 0.04525393596302505
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6775510204081633,
"acc_stderr": 0.02992310056368391,
"acc_norm": 0.6775510204081633,
"acc_norm_stderr": 0.02992310056368391
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7014925373134329,
"acc_stderr": 0.03235743789355043,
"acc_norm": 0.7014925373134329,
"acc_norm_stderr": 0.03235743789355043
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.74,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.74,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-virology|5": {
"acc": 0.42168674698795183,
"acc_stderr": 0.03844453181770917,
"acc_norm": 0.42168674698795183,
"acc_norm_stderr": 0.03844453181770917
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7017543859649122,
"acc_stderr": 0.03508771929824564,
"acc_norm": 0.7017543859649122,
"acc_norm_stderr": 0.03508771929824564
},
"harness|truthfulqa:mc|0": {
"mc1": 0.40636474908200737,
"mc1_stderr": 0.017193835812093904,
"mc2": 0.5730029044942812,
"mc2_stderr": 0.01530435216141352
},
"harness|winogrande|5": {
"acc": 0.7245461720599842,
"acc_stderr": 0.01255569005570953
},
"harness|gsm8k|5": {
"acc": 0.488248673237301,
"acc_stderr": 0.013768680408142796
}
}
```
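The headline `"all"` block above aggregates the per-task metrics. As a rough sketch of how such a macro average is formed (hypothetical: only a three-task subset of the scores shown above, not the full task list behind the leaderboard figure), one could compute:

```python
# Macro-average accuracy over a hypothetical subset of the per-task
# scores listed above (the real "all" figure averages every task).
scores = {
    "arc:challenge": 0.5042662116040956,
    "hellaswag": 0.5595498904600678,
    "winogrande": 0.7245461720599842,
}
macro_avg = sum(scores.values()) / len(scores)
print(round(macro_avg, 4))  # 0.5961
```

For the official numbers, load the `results` configuration of this dataset (as shown in the load example earlier in this card) rather than recomputing averages by hand.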
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
entrigna/custom-qa-v2-train-ds | entrigna | "2024-03-10T12:30:49Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T12:14:53Z" | ---
license: apache-2.0
dataset_info:
features:
- name: qa_instruction
dtype: string
splits:
- name: train
num_bytes: 7992652
num_examples: 9000
download_size: 4841482
dataset_size: 7992652
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
entrigna/custom-qa-v2-test-ds | entrigna | "2024-03-10T12:22:57Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T12:15:12Z" | ---
license: apache-2.0
dataset_info:
features:
- name: qa_instruction
dtype: string
splits:
- name: test
num_bytes: 897014
num_examples: 1000
download_size: 548765
dataset_size: 897014
configs:
- config_name: default
data_files:
- split: test
path: data/test-*
---
|
open-llm-leaderboard-old/details_theNovaAI__Supernova-experimental | open-llm-leaderboard-old | "2024-03-10T12:36:38Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T12:36:18Z" | ---
pretty_name: Evaluation run of theNovaAI/Supernova-experimental
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [theNovaAI/Supernova-experimental](https://huggingface.co/theNovaAI/Supernova-experimental)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run. The \"train\" split always points to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_theNovaAI__Supernova-experimental\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T12:34:01.420352](https://huggingface.co/datasets/open-llm-leaderboard/details_theNovaAI__Supernova-experimental/blob/main/results_2024-03-10T12-34-01.420352.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5663270464450889,\n\
\ \"acc_stderr\": 0.03356166882892655,\n \"acc_norm\": 0.5715895778655974,\n\
\ \"acc_norm_stderr\": 0.03426551856832842,\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.49371884206186833,\n\
\ \"mc2_stderr\": 0.015090933240631366\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5921501706484642,\n \"acc_stderr\": 0.014361097288449703,\n\
\ \"acc_norm\": 0.6305460750853242,\n \"acc_norm_stderr\": 0.014104578366491887\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6363274248157738,\n\
\ \"acc_stderr\": 0.004800728138792395,\n \"acc_norm\": 0.8365863373829915,\n\
\ \"acc_norm_stderr\": 0.0036898701424130753\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5037037037037037,\n\
\ \"acc_stderr\": 0.04319223625811331,\n \"acc_norm\": 0.5037037037037037,\n\
\ \"acc_norm_stderr\": 0.04319223625811331\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.5394736842105263,\n \"acc_stderr\": 0.04056242252249034,\n\
\ \"acc_norm\": 0.5394736842105263,\n \"acc_norm_stderr\": 0.04056242252249034\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.56,\n\
\ \"acc_stderr\": 0.04988876515698589,\n \"acc_norm\": 0.56,\n \
\ \"acc_norm_stderr\": 0.04988876515698589\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.5811320754716981,\n \"acc_stderr\": 0.03036505082911521,\n\
\ \"acc_norm\": 0.5811320754716981,\n \"acc_norm_stderr\": 0.03036505082911521\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.5972222222222222,\n\
\ \"acc_stderr\": 0.04101405519842426,\n \"acc_norm\": 0.5972222222222222,\n\
\ \"acc_norm_stderr\": 0.04101405519842426\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252604,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252604\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\"\
: 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.41,\n\
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.36,\n \"acc_stderr\": 0.04824181513244218,\n \
\ \"acc_norm\": 0.36,\n \"acc_norm_stderr\": 0.04824181513244218\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5491329479768786,\n\
\ \"acc_stderr\": 0.0379401267469703,\n \"acc_norm\": 0.5491329479768786,\n\
\ \"acc_norm_stderr\": 0.0379401267469703\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.2647058823529412,\n \"acc_stderr\": 0.04389869956808777,\n\
\ \"acc_norm\": 0.2647058823529412,\n \"acc_norm_stderr\": 0.04389869956808777\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.72,\n \"acc_stderr\": 0.04512608598542129,\n \"acc_norm\": 0.72,\n\
\ \"acc_norm_stderr\": 0.04512608598542129\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.4595744680851064,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.4595744680851064,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5172413793103449,\n \"acc_stderr\": 0.04164188720169375,\n\
\ \"acc_norm\": 0.5172413793103449,\n \"acc_norm_stderr\": 0.04164188720169375\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.328042328042328,\n \"acc_stderr\": 0.024180497164376914,\n \"\
acc_norm\": 0.328042328042328,\n \"acc_norm_stderr\": 0.024180497164376914\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.373015873015873,\n\
\ \"acc_stderr\": 0.04325506042017086,\n \"acc_norm\": 0.373015873015873,\n\
\ \"acc_norm_stderr\": 0.04325506042017086\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6580645161290323,\n\
\ \"acc_stderr\": 0.026985289576552746,\n \"acc_norm\": 0.6580645161290323,\n\
\ \"acc_norm_stderr\": 0.026985289576552746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4482758620689655,\n \"acc_stderr\": 0.03499113137676744,\n\
\ \"acc_norm\": 0.4482758620689655,\n \"acc_norm_stderr\": 0.03499113137676744\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.6727272727272727,\n \"acc_stderr\": 0.03663974994391244,\n\
\ \"acc_norm\": 0.6727272727272727,\n \"acc_norm_stderr\": 0.03663974994391244\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7171717171717171,\n \"acc_stderr\": 0.03208779558786752,\n \"\
acc_norm\": 0.7171717171717171,\n \"acc_norm_stderr\": 0.03208779558786752\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.7979274611398963,\n \"acc_stderr\": 0.02897908979429673,\n\
\ \"acc_norm\": 0.7979274611398963,\n \"acc_norm_stderr\": 0.02897908979429673\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.517948717948718,\n \"acc_stderr\": 0.025334667080954925,\n \
\ \"acc_norm\": 0.517948717948718,\n \"acc_norm_stderr\": 0.025334667080954925\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3296296296296296,\n \"acc_stderr\": 0.028661201116524575,\n \
\ \"acc_norm\": 0.3296296296296296,\n \"acc_norm_stderr\": 0.028661201116524575\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6008403361344538,\n \"acc_stderr\": 0.03181110032413926,\n \
\ \"acc_norm\": 0.6008403361344538,\n \"acc_norm_stderr\": 0.03181110032413926\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.31788079470198677,\n \"acc_stderr\": 0.038020397601079024,\n \"\
acc_norm\": 0.31788079470198677,\n \"acc_norm_stderr\": 0.038020397601079024\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7486238532110092,\n \"acc_stderr\": 0.018599206360287415,\n \"\
acc_norm\": 0.7486238532110092,\n \"acc_norm_stderr\": 0.018599206360287415\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.75,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.7721518987341772,\n \"acc_stderr\": 0.02730348459906943,\n\
\ \"acc_norm\": 0.7721518987341772,\n \"acc_norm_stderr\": 0.02730348459906943\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6412213740458015,\n \"acc_stderr\": 0.04206739313864908,\n\
\ \"acc_norm\": 0.6412213740458015,\n \"acc_norm_stderr\": 0.04206739313864908\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.743801652892562,\n \"acc_stderr\": 0.039849796533028725,\n \"\
acc_norm\": 0.743801652892562,\n \"acc_norm_stderr\": 0.039849796533028725\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7314814814814815,\n\
\ \"acc_stderr\": 0.042844679680521934,\n \"acc_norm\": 0.7314814814814815,\n\
\ \"acc_norm_stderr\": 0.042844679680521934\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7116564417177914,\n \"acc_stderr\": 0.03559039531617342,\n\
\ \"acc_norm\": 0.7116564417177914,\n \"acc_norm_stderr\": 0.03559039531617342\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.33035714285714285,\n\
\ \"acc_stderr\": 0.04464285714285714,\n \"acc_norm\": 0.33035714285714285,\n\
\ \"acc_norm_stderr\": 0.04464285714285714\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7281553398058253,\n \"acc_stderr\": 0.044052680241409216,\n\
\ \"acc_norm\": 0.7281553398058253,\n \"acc_norm_stderr\": 0.044052680241409216\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8034188034188035,\n\
\ \"acc_stderr\": 0.02603538609895129,\n \"acc_norm\": 0.8034188034188035,\n\
\ \"acc_norm_stderr\": 0.02603538609895129\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7637292464878672,\n\
\ \"acc_stderr\": 0.01519047371703751,\n \"acc_norm\": 0.7637292464878672,\n\
\ \"acc_norm_stderr\": 0.01519047371703751\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6416184971098265,\n \"acc_stderr\": 0.02581675679158419,\n\
\ \"acc_norm\": 0.6416184971098265,\n \"acc_norm_stderr\": 0.02581675679158419\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.47039106145251397,\n\
\ \"acc_stderr\": 0.016693154927383567,\n \"acc_norm\": 0.47039106145251397,\n\
\ \"acc_norm_stderr\": 0.016693154927383567\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6405228758169934,\n \"acc_stderr\": 0.027475969910660952,\n\
\ \"acc_norm\": 0.6405228758169934,\n \"acc_norm_stderr\": 0.027475969910660952\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6527331189710611,\n\
\ \"acc_stderr\": 0.027040745502307336,\n \"acc_norm\": 0.6527331189710611,\n\
\ \"acc_norm_stderr\": 0.027040745502307336\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.6327160493827161,\n \"acc_stderr\": 0.026822801759507894,\n\
\ \"acc_norm\": 0.6327160493827161,\n \"acc_norm_stderr\": 0.026822801759507894\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4219858156028369,\n \"acc_stderr\": 0.029462189233370593,\n \
\ \"acc_norm\": 0.4219858156028369,\n \"acc_norm_stderr\": 0.029462189233370593\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4302477183833116,\n\
\ \"acc_stderr\": 0.012645361435115231,\n \"acc_norm\": 0.4302477183833116,\n\
\ \"acc_norm_stderr\": 0.012645361435115231\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5404411764705882,\n \"acc_stderr\": 0.030273325077345755,\n\
\ \"acc_norm\": 0.5404411764705882,\n \"acc_norm_stderr\": 0.030273325077345755\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.5816993464052288,\n \"acc_stderr\": 0.019955975145835546,\n \
\ \"acc_norm\": 0.5816993464052288,\n \"acc_norm_stderr\": 0.019955975145835546\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.6204081632653061,\n \"acc_stderr\": 0.031067211262872468,\n\
\ \"acc_norm\": 0.6204081632653061,\n \"acc_norm_stderr\": 0.031067211262872468\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7412935323383084,\n\
\ \"acc_stderr\": 0.03096590312357302,\n \"acc_norm\": 0.7412935323383084,\n\
\ \"acc_norm_stderr\": 0.03096590312357302\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.83,\n \"acc_stderr\": 0.03775251680686371,\n \
\ \"acc_norm\": 0.83,\n \"acc_norm_stderr\": 0.03775251680686371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4819277108433735,\n\
\ \"acc_stderr\": 0.038899512528272166,\n \"acc_norm\": 0.4819277108433735,\n\
\ \"acc_norm_stderr\": 0.038899512528272166\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7953216374269005,\n \"acc_stderr\": 0.030944459778533197,\n\
\ \"acc_norm\": 0.7953216374269005,\n \"acc_norm_stderr\": 0.030944459778533197\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3390452876376989,\n\
\ \"mc1_stderr\": 0.016571797910626608,\n \"mc2\": 0.49371884206186833,\n\
\ \"mc2_stderr\": 0.015090933240631366\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7734806629834254,\n \"acc_stderr\": 0.011764149054698338\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.287338893100834,\n \
\ \"acc_stderr\": 0.012464677060107081\n }\n}\n```"
repo_url: https://huggingface.co/theNovaAI/Supernova-experimental
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|arc:challenge|25_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|gsm8k|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hellaswag|10_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-34-01.420352.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T12-34-01.420352.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- '**/details_harness|winogrande|5_2024-03-10T12-34-01.420352.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T12-34-01.420352.parquet'
- config_name: results
data_files:
- split: 2024_03_10T12_34_01.420352
path:
- results_2024-03-10T12-34-01.420352.parquet
- split: latest
path:
- results_2024-03-10T12-34-01.420352.parquet
---
# Dataset Card for Evaluation run of theNovaAI/Supernova-experimental
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [theNovaAI/Supernova-experimental](https://huggingface.co/theNovaAI/Supernova-experimental) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run.The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_theNovaAI__Supernova-experimental",
"harness_winogrande_5",
split="train")
```
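As a side note, the per-task config names in this repo appear to follow a simple mapping from the harness task ids (an observation from the YAML above, not an official API — the helper name below is illustrative):

```python
def task_to_config_name(task: str) -> str:
    """Map a harness task id like 'harness|hendrycksTest-world_religions|5'
    to the corresponding dataset config name.

    The '|', '-' and ':' separators all appear to be flattened to underscores.
    """
    return task.replace("|", "_").replace("-", "_").replace(":", "_")

print(task_to_config_name("harness|hendrycksTest-world_religions|5"))
# harness_hendrycksTest_world_religions_5
print(task_to_config_name("harness|truthfulqa:mc|0"))
# harness_truthfulqa_mc_0
```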
## Latest results
These are the [latest results from run 2024-03-10T12:34:01.420352](https://huggingface.co/datasets/open-llm-leaderboard/details_theNovaAI__Supernova-experimental/blob/main/results_2024-03-10T12-34-01.420352.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5663270464450889,
"acc_stderr": 0.03356166882892655,
"acc_norm": 0.5715895778655974,
"acc_norm_stderr": 0.03426551856832842,
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.49371884206186833,
"mc2_stderr": 0.015090933240631366
},
"harness|arc:challenge|25": {
"acc": 0.5921501706484642,
"acc_stderr": 0.014361097288449703,
"acc_norm": 0.6305460750853242,
"acc_norm_stderr": 0.014104578366491887
},
"harness|hellaswag|10": {
"acc": 0.6363274248157738,
"acc_stderr": 0.004800728138792395,
"acc_norm": 0.8365863373829915,
"acc_norm_stderr": 0.0036898701424130753
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5037037037037037,
"acc_stderr": 0.04319223625811331,
"acc_norm": 0.5037037037037037,
"acc_norm_stderr": 0.04319223625811331
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.5394736842105263,
"acc_stderr": 0.04056242252249034,
"acc_norm": 0.5394736842105263,
"acc_norm_stderr": 0.04056242252249034
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.56,
"acc_stderr": 0.04988876515698589,
"acc_norm": 0.56,
"acc_norm_stderr": 0.04988876515698589
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.5811320754716981,
"acc_stderr": 0.03036505082911521,
"acc_norm": 0.5811320754716981,
"acc_norm_stderr": 0.03036505082911521
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.5972222222222222,
"acc_stderr": 0.04101405519842426,
"acc_norm": 0.5972222222222222,
"acc_norm_stderr": 0.04101405519842426
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252604,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252604
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.36,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.36,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5491329479768786,
"acc_stderr": 0.0379401267469703,
"acc_norm": 0.5491329479768786,
"acc_norm_stderr": 0.0379401267469703
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.2647058823529412,
"acc_stderr": 0.04389869956808777,
"acc_norm": 0.2647058823529412,
"acc_norm_stderr": 0.04389869956808777
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.72,
"acc_stderr": 0.04512608598542129,
"acc_norm": 0.72,
"acc_norm_stderr": 0.04512608598542129
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.4595744680851064,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.4595744680851064,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5172413793103449,
"acc_stderr": 0.04164188720169375,
"acc_norm": 0.5172413793103449,
"acc_norm_stderr": 0.04164188720169375
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.328042328042328,
"acc_stderr": 0.024180497164376914,
"acc_norm": 0.328042328042328,
"acc_norm_stderr": 0.024180497164376914
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.373015873015873,
"acc_stderr": 0.04325506042017086,
"acc_norm": 0.373015873015873,
"acc_norm_stderr": 0.04325506042017086
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6580645161290323,
"acc_stderr": 0.026985289576552746,
"acc_norm": 0.6580645161290323,
"acc_norm_stderr": 0.026985289576552746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4482758620689655,
"acc_stderr": 0.03499113137676744,
"acc_norm": 0.4482758620689655,
"acc_norm_stderr": 0.03499113137676744
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.03663974994391244,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.03663974994391244
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7171717171717171,
"acc_stderr": 0.03208779558786752,
"acc_norm": 0.7171717171717171,
"acc_norm_stderr": 0.03208779558786752
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.7979274611398963,
"acc_stderr": 0.02897908979429673,
"acc_norm": 0.7979274611398963,
"acc_norm_stderr": 0.02897908979429673
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.517948717948718,
"acc_stderr": 0.025334667080954925,
"acc_norm": 0.517948717948718,
"acc_norm_stderr": 0.025334667080954925
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3296296296296296,
"acc_stderr": 0.028661201116524575,
"acc_norm": 0.3296296296296296,
"acc_norm_stderr": 0.028661201116524575
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6008403361344538,
"acc_stderr": 0.03181110032413926,
"acc_norm": 0.6008403361344538,
"acc_norm_stderr": 0.03181110032413926
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.31788079470198677,
"acc_stderr": 0.038020397601079024,
"acc_norm": 0.31788079470198677,
"acc_norm_stderr": 0.038020397601079024
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7486238532110092,
"acc_stderr": 0.018599206360287415,
"acc_norm": 0.7486238532110092,
"acc_norm_stderr": 0.018599206360287415
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.75,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7721518987341772,
"acc_stderr": 0.02730348459906943,
"acc_norm": 0.7721518987341772,
"acc_norm_stderr": 0.02730348459906943
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6412213740458015,
"acc_stderr": 0.04206739313864908,
"acc_norm": 0.6412213740458015,
"acc_norm_stderr": 0.04206739313864908
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.743801652892562,
"acc_stderr": 0.039849796533028725,
"acc_norm": 0.743801652892562,
"acc_norm_stderr": 0.039849796533028725
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7314814814814815,
"acc_stderr": 0.042844679680521934,
"acc_norm": 0.7314814814814815,
"acc_norm_stderr": 0.042844679680521934
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7116564417177914,
"acc_stderr": 0.03559039531617342,
"acc_norm": 0.7116564417177914,
"acc_norm_stderr": 0.03559039531617342
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.33035714285714285,
"acc_stderr": 0.04464285714285714,
"acc_norm": 0.33035714285714285,
"acc_norm_stderr": 0.04464285714285714
},
"harness|hendrycksTest-management|5": {
"acc": 0.7281553398058253,
"acc_stderr": 0.044052680241409216,
"acc_norm": 0.7281553398058253,
"acc_norm_stderr": 0.044052680241409216
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8034188034188035,
"acc_stderr": 0.02603538609895129,
"acc_norm": 0.8034188034188035,
"acc_norm_stderr": 0.02603538609895129
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7637292464878672,
"acc_stderr": 0.01519047371703751,
"acc_norm": 0.7637292464878672,
"acc_norm_stderr": 0.01519047371703751
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6416184971098265,
"acc_stderr": 0.02581675679158419,
"acc_norm": 0.6416184971098265,
"acc_norm_stderr": 0.02581675679158419
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.47039106145251397,
"acc_stderr": 0.016693154927383567,
"acc_norm": 0.47039106145251397,
"acc_norm_stderr": 0.016693154927383567
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6405228758169934,
"acc_stderr": 0.027475969910660952,
"acc_norm": 0.6405228758169934,
"acc_norm_stderr": 0.027475969910660952
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6527331189710611,
"acc_stderr": 0.027040745502307336,
"acc_norm": 0.6527331189710611,
"acc_norm_stderr": 0.027040745502307336
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.6327160493827161,
"acc_stderr": 0.026822801759507894,
"acc_norm": 0.6327160493827161,
"acc_norm_stderr": 0.026822801759507894
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4219858156028369,
"acc_stderr": 0.029462189233370593,
"acc_norm": 0.4219858156028369,
"acc_norm_stderr": 0.029462189233370593
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4302477183833116,
"acc_stderr": 0.012645361435115231,
"acc_norm": 0.4302477183833116,
"acc_norm_stderr": 0.012645361435115231
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5404411764705882,
"acc_stderr": 0.030273325077345755,
"acc_norm": 0.5404411764705882,
"acc_norm_stderr": 0.030273325077345755
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.5816993464052288,
"acc_stderr": 0.019955975145835546,
"acc_norm": 0.5816993464052288,
"acc_norm_stderr": 0.019955975145835546
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.6204081632653061,
"acc_stderr": 0.031067211262872468,
"acc_norm": 0.6204081632653061,
"acc_norm_stderr": 0.031067211262872468
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7412935323383084,
"acc_stderr": 0.03096590312357302,
"acc_norm": 0.7412935323383084,
"acc_norm_stderr": 0.03096590312357302
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.83,
"acc_stderr": 0.03775251680686371,
"acc_norm": 0.83,
"acc_norm_stderr": 0.03775251680686371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4819277108433735,
"acc_stderr": 0.038899512528272166,
"acc_norm": 0.4819277108433735,
"acc_norm_stderr": 0.038899512528272166
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7953216374269005,
"acc_stderr": 0.030944459778533197,
"acc_norm": 0.7953216374269005,
"acc_norm_stderr": 0.030944459778533197
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3390452876376989,
"mc1_stderr": 0.016571797910626608,
"mc2": 0.49371884206186833,
"mc2_stderr": 0.015090933240631366
},
"harness|winogrande|5": {
"acc": 0.7734806629834254,
"acc_stderr": 0.011764149054698338
},
"harness|gsm8k|5": {
"acc": 0.287338893100834,
"acc_stderr": 0.012464677060107081
}
}
```
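Since the JSON above keys each MMLU subtask as `harness|hendrycksTest-<subject>|5`, the macro-average over subjects can be recomputed with a few lines (a minimal sketch, assuming a results dict of the shape shown above):

```python
from statistics import mean

def mmlu_macro_average(results: dict) -> float:
    """Macro-average 'acc' over the hendrycksTest (MMLU) subtasks,
    ignoring non-MMLU entries such as winogrande or truthfulqa."""
    accs = [
        v["acc"]
        for k, v in results.items()
        if k.startswith("harness|hendrycksTest-")
    ]
    return mean(accs)

# Tiny illustration using two of the subtask entries from above:
sample = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.36},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5037037037037037},
    "harness|winogrande|5": {"acc": 0.7734806629834254},  # ignored
}
print(round(mmlu_macro_average(sample), 4))  # 0.4319
```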
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NorGLM/NO-BoolQ | NorGLM | "2024-10-01T18:37:14Z" | 0 | 0 | [
"language:no",
"license:cc-by-sa-3.0",
"size_categories:10K<n<100K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"arxiv:2312.01314",
"region:us"
] | null | "2024-03-10T12:53:40Z" | ---
license: cc-by-sa-3.0
language:
- 'no'
---
## Dataset Card for NO-BoolQ ##
NO-BoolQ is machine translated from the [Google BoolQ dataset](https://huggingface.co/datasets/google/boolq). It is a question answering dataset, split into train, test and validation sets in the same way as its original dataset.
This dataset belongs to NLEBench, a collection of Norwegian benchmarks for evaluating Norwegian Natural Language Understanding (NLU) tasks.
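As with the original BoolQ, each example is expected to carry a yes/no label; a quick way to inspect the label balance is sketched below (an assumption: the field name `answer` follows the Google BoolQ schema and may differ in this translation):

```python
from collections import Counter

def label_balance(examples):
    """Count true/false answers in a list of BoolQ-style examples."""
    return Counter(bool(ex["answer"]) for ex in examples)

# Illustrative dummy rows in the assumed schema:
rows = [
    {"question": "er Oslo hovedstaden i Norge", "answer": True},
    {"question": "er Bergen hovedstaden i Norge", "answer": False},
    {"question": "ligger Norge i Europa", "answer": True},
]
print(label_balance(rows))  # Counter({True: 2, False: 1})
```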
## Licensing Information
This dataset is built upon an existing dataset. We therefore follow its original license information.
## Citation Information
If you find our work helpful, please cite our paper:
```
@article{liu2023nlebench+,
title={NLEBench+ NorGLM: A Comprehensive Empirical Analysis and Benchmark Dataset for Generative Language Models in Norwegian},
author={Liu, Peng and Zhang, Lemei and Farup, Terje Nissen and Lauvrak, Even W and Ingvaldsen, Jon Espen and Eide, Simen and Gulla, Jon Atle and Yang, Zhirong},
journal={arXiv preprint arXiv:2312.01314},
year={2023}
}
``` |
japanese-asr/whisper_transcriptions.reazonspeech.large | japanese-asr | "2024-03-11T14:43:17Z" | 0 | 0 | [
"size_categories:1M<n<10M",
"format:parquet",
"modality:audio",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T12:58:27Z" | ---
dataset_info:
config_name: large
features:
- name: name
dtype: string
- name: audio
dtype:
audio:
sampling_rate: 16000
- name: transcription
dtype: string
- name: whisper_transcript
sequence: int64
splits:
- name: train
num_bytes: 352287732593.0
num_examples: 3096342
download_size: 349629407588
dataset_size: 352287732593.0
configs:
- config_name: large
data_files:
- split: train
path: large/train-*
---
|
open-llm-leaderboard-old/details_abacusai__bigyi-15b | open-llm-leaderboard-old | "2024-05-04T23:30:57Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T12:59:59Z" | ---
pretty_name: Evaluation run of abacusai/bigyi-15b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [abacusai/bigyi-15b](https://huggingface.co/abacusai/bigyi-15b) on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_abacusai__bigyi-15b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T12:57:48.733111](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__bigyi-15b/blob/main/results_2024-03-10T12-57-48.733111.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6368698545001125,\n\
\ \"acc_stderr\": 0.03234846946993144,\n \"acc_norm\": 0.6464868762775356,\n\
\ \"acc_norm_stderr\": 0.03302252322767117,\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570338,\n \"mc2\": 0.37326698740277925,\n\
\ \"mc2_stderr\": 0.01461558650400129\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5324232081911263,\n \"acc_stderr\": 0.014580637569995426,\n\
\ \"acc_norm\": 0.560580204778157,\n \"acc_norm_stderr\": 0.014503747823580127\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.5749850627365066,\n\
\ \"acc_stderr\": 0.004933349621589336,\n \"acc_norm\": 0.7590121489743079,\n\
\ \"acc_norm_stderr\": 0.004268088879039825\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6973684210526315,\n \"acc_stderr\": 0.03738520676119667,\n\
\ \"acc_norm\": 0.6973684210526315,\n \"acc_norm_stderr\": 0.03738520676119667\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.73,\n\
\ \"acc_stderr\": 0.04461960433384741,\n \"acc_norm\": 0.73,\n \
\ \"acc_norm_stderr\": 0.04461960433384741\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7847222222222222,\n\
\ \"acc_stderr\": 0.03437079344106135,\n \"acc_norm\": 0.7847222222222222,\n\
\ \"acc_norm_stderr\": 0.03437079344106135\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.47,\n \"acc_stderr\": 0.050161355804659205,\n \
\ \"acc_norm\": 0.47,\n \"acc_norm_stderr\": 0.050161355804659205\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.55,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.55,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.43,\n \"acc_stderr\": 0.049756985195624284,\n \
\ \"acc_norm\": 0.43,\n \"acc_norm_stderr\": 0.049756985195624284\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6473988439306358,\n\
\ \"acc_stderr\": 0.03643037168958548,\n \"acc_norm\": 0.6473988439306358,\n\
\ \"acc_norm_stderr\": 0.03643037168958548\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.78,\n \"acc_stderr\": 0.04163331998932264,\n \"acc_norm\": 0.78,\n\
\ \"acc_norm_stderr\": 0.04163331998932264\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.625531914893617,\n \"acc_stderr\": 0.031639106653672915,\n\
\ \"acc_norm\": 0.625531914893617,\n \"acc_norm_stderr\": 0.031639106653672915\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.45614035087719296,\n\
\ \"acc_stderr\": 0.046854730419077895,\n \"acc_norm\": 0.45614035087719296,\n\
\ \"acc_norm_stderr\": 0.046854730419077895\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.6206896551724138,\n \"acc_stderr\": 0.040434618619167466,\n\
\ \"acc_norm\": 0.6206896551724138,\n \"acc_norm_stderr\": 0.040434618619167466\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.5317460317460317,\n \"acc_stderr\": 0.025699352832131792,\n \"\
acc_norm\": 0.5317460317460317,\n \"acc_norm_stderr\": 0.025699352832131792\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.047258156262526045,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.047258156262526045\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7806451612903226,\n \"acc_stderr\": 0.023540799358723295,\n \"\
acc_norm\": 0.7806451612903226,\n \"acc_norm_stderr\": 0.023540799358723295\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n \"\
acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816507,\n \"acc_norm\"\
: 0.77,\n \"acc_norm_stderr\": 0.04229525846816507\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7818181818181819,\n \"acc_stderr\": 0.032250781083062896,\n\
\ \"acc_norm\": 0.7818181818181819,\n \"acc_norm_stderr\": 0.032250781083062896\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.797979797979798,\n \"acc_stderr\": 0.028606204289229872,\n \"\
acc_norm\": 0.797979797979798,\n \"acc_norm_stderr\": 0.028606204289229872\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709443,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709443\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6717948717948717,\n \"acc_stderr\": 0.023807633198657262,\n\
\ \"acc_norm\": 0.6717948717948717,\n \"acc_norm_stderr\": 0.023807633198657262\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.37037037037037035,\n \"acc_stderr\": 0.02944316932303154,\n \
\ \"acc_norm\": 0.37037037037037035,\n \"acc_norm_stderr\": 0.02944316932303154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6932773109243697,\n \"acc_stderr\": 0.02995382389188704,\n \
\ \"acc_norm\": 0.6932773109243697,\n \"acc_norm_stderr\": 0.02995382389188704\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3841059602649007,\n \"acc_stderr\": 0.03971301814719197,\n \"\
acc_norm\": 0.3841059602649007,\n \"acc_norm_stderr\": 0.03971301814719197\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8458715596330275,\n \"acc_stderr\": 0.015480826865374291,\n \"\
acc_norm\": 0.8458715596330275,\n \"acc_norm_stderr\": 0.015480826865374291\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5416666666666666,\n \"acc_stderr\": 0.03398110890294636,\n \"\
acc_norm\": 0.5416666666666666,\n \"acc_norm_stderr\": 0.03398110890294636\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8088235294117647,\n \"acc_stderr\": 0.027599174300640773,\n \"\
acc_norm\": 0.8088235294117647,\n \"acc_norm_stderr\": 0.027599174300640773\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7679324894514767,\n \"acc_stderr\": 0.027479744550808514,\n \
\ \"acc_norm\": 0.7679324894514767,\n \"acc_norm_stderr\": 0.027479744550808514\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7309417040358744,\n\
\ \"acc_stderr\": 0.02976377940687497,\n \"acc_norm\": 0.7309417040358744,\n\
\ \"acc_norm_stderr\": 0.02976377940687497\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7404580152671756,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.7404580152671756,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.71900826446281,\n \"acc_stderr\": 0.04103203830514511,\n \"acc_norm\"\
: 0.71900826446281,\n \"acc_norm_stderr\": 0.04103203830514511\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7037037037037037,\n\
\ \"acc_stderr\": 0.044143436668549335,\n \"acc_norm\": 0.7037037037037037,\n\
\ \"acc_norm_stderr\": 0.044143436668549335\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.48214285714285715,\n\
\ \"acc_stderr\": 0.047427623612430116,\n \"acc_norm\": 0.48214285714285715,\n\
\ \"acc_norm_stderr\": 0.047427623612430116\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.8155339805825242,\n \"acc_stderr\": 0.03840423627288276,\n\
\ \"acc_norm\": 0.8155339805825242,\n \"acc_norm_stderr\": 0.03840423627288276\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507332,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507332\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7982120051085568,\n\
\ \"acc_stderr\": 0.014351702181636861,\n \"acc_norm\": 0.7982120051085568,\n\
\ \"acc_norm_stderr\": 0.014351702181636861\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.684971098265896,\n \"acc_stderr\": 0.025009313790069716,\n\
\ \"acc_norm\": 0.684971098265896,\n \"acc_norm_stderr\": 0.025009313790069716\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.33854748603351953,\n\
\ \"acc_stderr\": 0.01582670009648135,\n \"acc_norm\": 0.33854748603351953,\n\
\ \"acc_norm_stderr\": 0.01582670009648135\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7156862745098039,\n \"acc_stderr\": 0.02582916327275748,\n\
\ \"acc_norm\": 0.7156862745098039,\n \"acc_norm_stderr\": 0.02582916327275748\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7138263665594855,\n\
\ \"acc_stderr\": 0.025670259242188947,\n \"acc_norm\": 0.7138263665594855,\n\
\ \"acc_norm_stderr\": 0.025670259242188947\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7160493827160493,\n \"acc_stderr\": 0.025089478523765134,\n\
\ \"acc_norm\": 0.7160493827160493,\n \"acc_norm_stderr\": 0.025089478523765134\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.5141843971631206,\n \"acc_stderr\": 0.02981549448368206,\n \
\ \"acc_norm\": 0.5141843971631206,\n \"acc_norm_stderr\": 0.02981549448368206\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4654498044328553,\n\
\ \"acc_stderr\": 0.012739711554045708,\n \"acc_norm\": 0.4654498044328553,\n\
\ \"acc_norm_stderr\": 0.012739711554045708\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6544117647058824,\n \"acc_stderr\": 0.028888193103988637,\n\
\ \"acc_norm\": 0.6544117647058824,\n \"acc_norm_stderr\": 0.028888193103988637\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6486928104575164,\n \"acc_stderr\": 0.01931267606578655,\n \
\ \"acc_norm\": 0.6486928104575164,\n \"acc_norm_stderr\": 0.01931267606578655\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.7,\n\
\ \"acc_stderr\": 0.04389311454644287,\n \"acc_norm\": 0.7,\n \
\ \"acc_norm_stderr\": 0.04389311454644287\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7020408163265306,\n \"acc_stderr\": 0.029279567411065674,\n\
\ \"acc_norm\": 0.7020408163265306,\n \"acc_norm_stderr\": 0.029279567411065674\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7960199004975125,\n\
\ \"acc_stderr\": 0.02849317624532607,\n \"acc_norm\": 0.7960199004975125,\n\
\ \"acc_norm_stderr\": 0.02849317624532607\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.03588702812826371,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.03588702812826371\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5240963855421686,\n\
\ \"acc_stderr\": 0.03887971849597264,\n \"acc_norm\": 0.5240963855421686,\n\
\ \"acc_norm_stderr\": 0.03887971849597264\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8070175438596491,\n \"acc_stderr\": 0.030267457554898458,\n\
\ \"acc_norm\": 0.8070175438596491,\n \"acc_norm_stderr\": 0.030267457554898458\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.2423500611995104,\n\
\ \"mc1_stderr\": 0.015000674373570338,\n \"mc2\": 0.37326698740277925,\n\
\ \"mc2_stderr\": 0.01461558650400129\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7024467245461721,\n \"acc_stderr\": 0.012849085254614657\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.21607278241091737,\n \
\ \"acc_stderr\": 0.011336531489638873\n }\n}\n```"
repo_url: https://huggingface.co/abacusai/bigyi-15b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|arc:challenge|25_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|gsm8k|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hellaswag|10_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-57-48.733111.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T12-57-48.733111.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- '**/details_harness|winogrande|5_2024-03-10T12-57-48.733111.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T12-57-48.733111.parquet'
- config_name: results
data_files:
- split: 2024_03_10T12_57_48.733111
path:
- results_2024-03-10T12-57-48.733111.parquet
- split: latest
path:
- results_2024-03-10T12-57-48.733111.parquet
---
# Dataset Card for Evaluation run of abacusai/bigyi-15b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [abacusai/bigyi-15b](https://huggingface.co/abacusai/bigyi-15b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset(
    "open-llm-leaderboard/details_abacusai__bigyi-15b",
    "harness_winogrande_5",
    split="train",
)
```
## Latest results
These are the [latest results from run 2024-03-10T12:57:48.733111](https://huggingface.co/datasets/open-llm-leaderboard/details_abacusai__bigyi-15b/blob/main/results_2024-03-10T12-57-48.733111.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6368698545001125,
"acc_stderr": 0.03234846946993144,
"acc_norm": 0.6464868762775356,
"acc_norm_stderr": 0.03302252322767117,
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570338,
"mc2": 0.37326698740277925,
"mc2_stderr": 0.01461558650400129
},
"harness|arc:challenge|25": {
"acc": 0.5324232081911263,
"acc_stderr": 0.014580637569995426,
"acc_norm": 0.560580204778157,
"acc_norm_stderr": 0.014503747823580127
},
"harness|hellaswag|10": {
"acc": 0.5749850627365066,
"acc_stderr": 0.004933349621589336,
"acc_norm": 0.7590121489743079,
"acc_norm_stderr": 0.004268088879039825
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6973684210526315,
"acc_stderr": 0.03738520676119667,
"acc_norm": 0.6973684210526315,
"acc_norm_stderr": 0.03738520676119667
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.73,
"acc_stderr": 0.04461960433384741,
"acc_norm": 0.73,
"acc_norm_stderr": 0.04461960433384741
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7847222222222222,
"acc_stderr": 0.03437079344106135,
"acc_norm": 0.7847222222222222,
"acc_norm_stderr": 0.03437079344106135
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.47,
"acc_stderr": 0.050161355804659205,
"acc_norm": 0.47,
"acc_norm_stderr": 0.050161355804659205
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.05,
"acc_norm": 0.55,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.43,
"acc_stderr": 0.049756985195624284,
"acc_norm": 0.43,
"acc_norm_stderr": 0.049756985195624284
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6473988439306358,
"acc_stderr": 0.03643037168958548,
"acc_norm": 0.6473988439306358,
"acc_norm_stderr": 0.03643037168958548
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.78,
"acc_stderr": 0.04163331998932264,
"acc_norm": 0.78,
"acc_norm_stderr": 0.04163331998932264
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.625531914893617,
"acc_stderr": 0.031639106653672915,
"acc_norm": 0.625531914893617,
"acc_norm_stderr": 0.031639106653672915
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.45614035087719296,
"acc_stderr": 0.046854730419077895,
"acc_norm": 0.45614035087719296,
"acc_norm_stderr": 0.046854730419077895
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.6206896551724138,
"acc_stderr": 0.040434618619167466,
"acc_norm": 0.6206896551724138,
"acc_norm_stderr": 0.040434618619167466
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.025699352832131792,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.025699352832131792
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.33,
"acc_stderr": 0.047258156262526045,
"acc_norm": 0.33,
"acc_norm_stderr": 0.047258156262526045
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7806451612903226,
"acc_stderr": 0.023540799358723295,
"acc_norm": 0.7806451612903226,
"acc_norm_stderr": 0.023540799358723295
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816507,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816507
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7818181818181819,
"acc_stderr": 0.032250781083062896,
"acc_norm": 0.7818181818181819,
"acc_norm_stderr": 0.032250781083062896
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.797979797979798,
"acc_stderr": 0.028606204289229872,
"acc_norm": 0.797979797979798,
"acc_norm_stderr": 0.028606204289229872
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709443,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709443
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6717948717948717,
"acc_stderr": 0.023807633198657262,
"acc_norm": 0.6717948717948717,
"acc_norm_stderr": 0.023807633198657262
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.37037037037037035,
"acc_stderr": 0.02944316932303154,
"acc_norm": 0.37037037037037035,
"acc_norm_stderr": 0.02944316932303154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6932773109243697,
"acc_stderr": 0.02995382389188704,
"acc_norm": 0.6932773109243697,
"acc_norm_stderr": 0.02995382389188704
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3841059602649007,
"acc_stderr": 0.03971301814719197,
"acc_norm": 0.3841059602649007,
"acc_norm_stderr": 0.03971301814719197
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.015480826865374291,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.015480826865374291
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5416666666666666,
"acc_stderr": 0.03398110890294636,
"acc_norm": 0.5416666666666666,
"acc_norm_stderr": 0.03398110890294636
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8088235294117647,
"acc_stderr": 0.027599174300640773,
"acc_norm": 0.8088235294117647,
"acc_norm_stderr": 0.027599174300640773
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7679324894514767,
"acc_stderr": 0.027479744550808514,
"acc_norm": 0.7679324894514767,
"acc_norm_stderr": 0.027479744550808514
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7309417040358744,
"acc_stderr": 0.02976377940687497,
"acc_norm": 0.7309417040358744,
"acc_norm_stderr": 0.02976377940687497
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7404580152671756,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.7404580152671756,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.71900826446281,
"acc_stderr": 0.04103203830514511,
"acc_norm": 0.71900826446281,
"acc_norm_stderr": 0.04103203830514511
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7037037037037037,
"acc_stderr": 0.044143436668549335,
"acc_norm": 0.7037037037037037,
"acc_norm_stderr": 0.044143436668549335
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.48214285714285715,
"acc_stderr": 0.047427623612430116,
"acc_norm": 0.48214285714285715,
"acc_norm_stderr": 0.047427623612430116
},
"harness|hendrycksTest-management|5": {
"acc": 0.8155339805825242,
"acc_stderr": 0.03840423627288276,
"acc_norm": 0.8155339805825242,
"acc_norm_stderr": 0.03840423627288276
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507332,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507332
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7982120051085568,
"acc_stderr": 0.014351702181636861,
"acc_norm": 0.7982120051085568,
"acc_norm_stderr": 0.014351702181636861
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.684971098265896,
"acc_stderr": 0.025009313790069716,
"acc_norm": 0.684971098265896,
"acc_norm_stderr": 0.025009313790069716
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.33854748603351953,
"acc_stderr": 0.01582670009648135,
"acc_norm": 0.33854748603351953,
"acc_norm_stderr": 0.01582670009648135
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7156862745098039,
"acc_stderr": 0.02582916327275748,
"acc_norm": 0.7156862745098039,
"acc_norm_stderr": 0.02582916327275748
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7138263665594855,
"acc_stderr": 0.025670259242188947,
"acc_norm": 0.7138263665594855,
"acc_norm_stderr": 0.025670259242188947
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7160493827160493,
"acc_stderr": 0.025089478523765134,
"acc_norm": 0.7160493827160493,
"acc_norm_stderr": 0.025089478523765134
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.5141843971631206,
"acc_stderr": 0.02981549448368206,
"acc_norm": 0.5141843971631206,
"acc_norm_stderr": 0.02981549448368206
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4654498044328553,
"acc_stderr": 0.012739711554045708,
"acc_norm": 0.4654498044328553,
"acc_norm_stderr": 0.012739711554045708
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6544117647058824,
"acc_stderr": 0.028888193103988637,
"acc_norm": 0.6544117647058824,
"acc_norm_stderr": 0.028888193103988637
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6486928104575164,
"acc_stderr": 0.01931267606578655,
"acc_norm": 0.6486928104575164,
"acc_norm_stderr": 0.01931267606578655
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.7,
"acc_stderr": 0.04389311454644287,
"acc_norm": 0.7,
"acc_norm_stderr": 0.04389311454644287
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7020408163265306,
"acc_stderr": 0.029279567411065674,
"acc_norm": 0.7020408163265306,
"acc_norm_stderr": 0.029279567411065674
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7960199004975125,
"acc_stderr": 0.02849317624532607,
"acc_norm": 0.7960199004975125,
"acc_norm_stderr": 0.02849317624532607
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.03588702812826371,
"acc_norm": 0.85,
"acc_norm_stderr": 0.03588702812826371
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5240963855421686,
"acc_stderr": 0.03887971849597264,
"acc_norm": 0.5240963855421686,
"acc_norm_stderr": 0.03887971849597264
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8070175438596491,
"acc_stderr": 0.030267457554898458,
"acc_norm": 0.8070175438596491,
"acc_norm_stderr": 0.030267457554898458
},
"harness|truthfulqa:mc|0": {
"mc1": 0.2423500611995104,
"mc1_stderr": 0.015000674373570338,
"mc2": 0.37326698740277925,
"mc2_stderr": 0.01461558650400129
},
"harness|winogrande|5": {
"acc": 0.7024467245461721,
"acc_stderr": 0.012849085254614657
},
"harness|gsm8k|5": {
"acc": 0.21607278241091737,
"acc_stderr": 0.011336531489638873
}
}
```
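The per-task scores in a results file like the one above are plain nested dictionaries, so they can be aggregated with a few lines of Python. The sketch below operates on a small hypothetical excerpt of the JSON shown above; `mean_metric` is an illustrative helper, not part of the `datasets` API:

```python
# Hypothetical excerpt of the results JSON above (not the full output).
results = {
    "harness|arc:challenge|25": {"acc": 0.5324232081911263, "acc_norm": 0.560580204778157},
    "harness|hellaswag|10": {"acc": 0.5749850627365066, "acc_norm": 0.7590121489743079},
    "harness|winogrande|5": {"acc": 0.7024467245461721},
}

def mean_metric(results, metric):
    """Average `metric` over every task that reports it."""
    values = [scores[metric] for scores in results.values() if metric in scores]
    return sum(values) / len(values)

print(round(mean_metric(results, "acc_norm"), 4))  # → 0.6598
```

Note that `winogrande` reports no `acc_norm`, so the helper simply skips tasks that lack the requested metric.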
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
zjunlp/ConceptEdit | zjunlp | "2024-03-12T15:04:59Z" | 0 | 4 | [
"license:cc-by-nc-sa-4.0",
"arxiv:2403.06259",
"region:us"
] | null | "2024-03-10T13:08:00Z" | ---
license: cc-by-nc-sa-4.0
---
<div align="center">
**Editing Conceptual Knowledge for Large Language Models**
---
<p align="center">
<a href="#-conceptual-knowledge-editing">Overview</a> •
<a href="#-usage">How To Use</a> •
<a href="#-citation">Citation</a> •
<a href="https://arxiv.org/abs/2403.06259">Paper</a> •
<a href="https://zjunlp.github.io/project/ConceptEdit">Website</a>
</p>
</div>
## 💡 Conceptual Knowledge Editing
<div align=center>
<img src="./flow1.gif" width="70%" height="70%" />
</div>
### Task Definition
**Concept** is a generalization of the world in the process of cognition, which represents the shared features and essential characteristics of a class of entities.
Therefore, the endeavor of concept editing aims to modify the definition of concepts, thereby altering the behavior of LLMs when processing these concepts.
### Evaluation
To analyze conceptual knowledge modification, we adopt the metrics for factual editing (the target is the concept $C$ rather than factual instance $t$).
- `Reliability`: the success rate of editing with a given editing description
- `Generalization`: the success rate of editing **within** the editing scope
- `Locality`: whether the model's output changes after editing for unrelated inputs
**Concept-Specific Evaluation Metrics**
- `Instance Change`: capturing the intricacies of these instance-level changes
- `Concept Consistency`: the semantic similarity of the generated concept definition
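To illustrate how a Reliability-style metric reduces to a simple success rate, here is a minimal sketch; `edit_success_rate` and the example strings are hypothetical, not code from the ConceptEdit/EasyEdit codebase:

```python
def edit_success_rate(post_edit_outputs, edit_targets):
    """Reliability-style metric: fraction of post-edit model outputs
    that match the intended edit target (case-insensitive exact match)."""
    assert len(post_edit_outputs) == len(edit_targets)
    hits = sum(
        out.strip().lower() == tgt.strip().lower()
        for out, tgt in zip(post_edit_outputs, edit_targets)
    )
    return hits / len(edit_targets)

outputs = ["A mammal", "a reptile", "a mammal"]
targets = ["a mammal", "a mammal", "a mammal"]
print(edit_success_rate(outputs, targets))  # → 0.6666666666666666
```

Generalization and Locality follow the same pattern, only computed over paraphrased in-scope prompts and unrelated out-of-scope prompts, respectively.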
## 🌟 Usage
### 🎍 Current Implementation
As shown in the main table of our paper, four editing methods are supported for conceptual knowledge editing.
| **Method** | GPT-2 | GPT-J | LlaMA2-13B-Chat | Mistral-7B-v0.1 |
| :--------------: | :--------------: | :--------------: | :--------------: | :--------------: |
| FT | ✅ | ✅ | ✅ | ✅ |
| ROME | ✅ | ✅ | ✅ | ✅ |
| MEMIT | ✅ | ✅ | ✅ | ✅ |
| PROMPT | ✅ | ✅ | ✅ | ✅ |
### 💻 Run
You can follow [EasyEdit](https://github.com/zjunlp/EasyEdit/edit/main/examples/ConceptEdit.md) to run the experiments.
## 📖 Citation
Please cite our paper if you use **ConceptEdit** in your work.
```bibtex
@misc{wang2024editing,
title={Editing Conceptual Knowledge for Large Language Models},
author={Xiaohan Wang and Shengyu Mao and Ningyu Zhang and Shumin Deng and Yunzhi Yao and Yue Shen and Lei Liang and Jinjie Gu and Huajun Chen},
year={2024},
eprint={2403.06259},
archivePrefix={arXiv},
primaryClass={cs.CL}
}
```
## 🎉 Acknowledgement
We would like to express our sincere gratitude to [DBpedia](https://www.dbpedia.org/resources/ontology/), [Wikidata](https://www.wikidata.org/wiki/Wikidata:Introduction), [OntoProbe-PLMs](https://github.com/vickywu1022/OntoProbe-PLMs), and [ROME](https://github.com/kmeng01/rome).
Their contributions are invaluable to the advancement of our work.
|
open-llm-leaderboard-old/details_FuseAI__FuseChat-7B-TA | open-llm-leaderboard-old | "2024-03-10T13:12:23Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T13:12:02Z" | ---
pretty_name: Evaluation run of FuseAI/FuseChat-7B-TA
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [FuseAI/FuseChat-7B-TA](https://huggingface.co/FuseAI/FuseChat-7B-TA) on the [Open\
\ LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" stores all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_FuseAI__FuseChat-7B-TA\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T13:09:42.780252](https://huggingface.co/datasets/open-llm-leaderboard/details_FuseAI__FuseChat-7B-TA/blob/main/results_2024-03-10T13-09-42.780252.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.640970662790146,\n\
\ \"acc_stderr\": 0.032240544947752886,\n \"acc_norm\": 0.6427814675983042,\n\
\ \"acc_norm_stderr\": 0.03289036996539776,\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323016,\n \"mc2\": 0.45736495798430604,\n\
\ \"mc2_stderr\": 0.015085741763833444\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5793515358361775,\n \"acc_stderr\": 0.014426211252508397,\n\
\ \"acc_norm\": 0.6254266211604096,\n \"acc_norm_stderr\": 0.014144193471893452\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6491734714200359,\n\
\ \"acc_stderr\": 0.004762534245488402,\n \"acc_norm\": 0.842162915753834,\n\
\ \"acc_norm_stderr\": 0.0036384306206139272\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5851851851851851,\n\
\ \"acc_stderr\": 0.04256193767901408,\n \"acc_norm\": 0.5851851851851851,\n\
\ \"acc_norm_stderr\": 0.04256193767901408\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.64,\n\
\ \"acc_stderr\": 0.04824181513244218,\n \"acc_norm\": 0.64,\n \
\ \"acc_norm_stderr\": 0.04824181513244218\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6867924528301886,\n \"acc_stderr\": 0.028544793319055326,\n\
\ \"acc_norm\": 0.6867924528301886,\n \"acc_norm_stderr\": 0.028544793319055326\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.75,\n\
\ \"acc_stderr\": 0.03621034121889507,\n \"acc_norm\": 0.75,\n \
\ \"acc_norm_stderr\": 0.03621034121889507\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\": 0.45,\n \
\ \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.43137254901960786,\n \"acc_stderr\": 0.04928099597287534,\n\
\ \"acc_norm\": 0.43137254901960786,\n \"acc_norm_stderr\": 0.04928099597287534\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.74,\n \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.74,\n\
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5702127659574469,\n \"acc_stderr\": 0.03236214467715564,\n\
\ \"acc_norm\": 0.5702127659574469,\n \"acc_norm_stderr\": 0.03236214467715564\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.5350877192982456,\n\
\ \"acc_stderr\": 0.046920083813689104,\n \"acc_norm\": 0.5350877192982456,\n\
\ \"acc_norm_stderr\": 0.046920083813689104\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5517241379310345,\n \"acc_stderr\": 0.04144311810878151,\n\
\ \"acc_norm\": 0.5517241379310345,\n \"acc_norm_stderr\": 0.04144311810878151\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.02546714904546955,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.02546714904546955\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.5317460317460317,\n\
\ \"acc_stderr\": 0.04463112720677172,\n \"acc_norm\": 0.5317460317460317,\n\
\ \"acc_norm_stderr\": 0.04463112720677172\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.28,\n \"acc_stderr\": 0.045126085985421276,\n \
\ \"acc_norm\": 0.28,\n \"acc_norm_stderr\": 0.045126085985421276\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7903225806451613,\n \"acc_stderr\": 0.023157879349083525,\n \"\
acc_norm\": 0.7903225806451613,\n \"acc_norm_stderr\": 0.023157879349083525\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.49261083743842365,\n \"acc_stderr\": 0.035176035403610084,\n \"\
acc_norm\": 0.49261083743842365,\n \"acc_norm_stderr\": 0.035176035403610084\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\"\
: 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7777777777777778,\n \"acc_stderr\": 0.02962022787479048,\n \"\
acc_norm\": 0.7777777777777778,\n \"acc_norm_stderr\": 0.02962022787479048\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.917098445595855,\n \"acc_stderr\": 0.01989934131572178,\n\
\ \"acc_norm\": 0.917098445595855,\n \"acc_norm_stderr\": 0.01989934131572178\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.658974358974359,\n \"acc_stderr\": 0.02403548967633508,\n \
\ \"acc_norm\": 0.658974358974359,\n \"acc_norm_stderr\": 0.02403548967633508\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.3333333333333333,\n \"acc_stderr\": 0.028742040903948492,\n \
\ \"acc_norm\": 0.3333333333333333,\n \"acc_norm_stderr\": 0.028742040903948492\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6638655462184874,\n \"acc_stderr\": 0.030684737115135353,\n\
\ \"acc_norm\": 0.6638655462184874,\n \"acc_norm_stderr\": 0.030684737115135353\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.304635761589404,\n \"acc_stderr\": 0.03757949922943343,\n \"acc_norm\"\
: 0.304635761589404,\n \"acc_norm_stderr\": 0.03757949922943343\n },\n\
\ \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\": 0.8458715596330275,\n\
\ \"acc_stderr\": 0.0154808268653743,\n \"acc_norm\": 0.8458715596330275,\n\
\ \"acc_norm_stderr\": 0.0154808268653743\n },\n \"harness|hendrycksTest-high_school_statistics|5\"\
: {\n \"acc\": 0.5370370370370371,\n \"acc_stderr\": 0.03400603625538272,\n\
\ \"acc_norm\": 0.5370370370370371,\n \"acc_norm_stderr\": 0.03400603625538272\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7892156862745098,\n \"acc_stderr\": 0.02862654791243742,\n \"\
acc_norm\": 0.7892156862745098,\n \"acc_norm_stderr\": 0.02862654791243742\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621115,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621115\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.7174887892376681,\n\
\ \"acc_stderr\": 0.030216831011508762,\n \"acc_norm\": 0.7174887892376681,\n\
\ \"acc_norm_stderr\": 0.030216831011508762\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7557251908396947,\n \"acc_stderr\": 0.037683359597287434,\n\
\ \"acc_norm\": 0.7557251908396947,\n \"acc_norm_stderr\": 0.037683359597287434\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7520661157024794,\n \"acc_stderr\": 0.03941897526516303,\n \"\
acc_norm\": 0.7520661157024794,\n \"acc_norm_stderr\": 0.03941897526516303\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252627,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252627\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.039891398595317706,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.039891398595317706\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8675213675213675,\n\
\ \"acc_stderr\": 0.022209309073165612,\n \"acc_norm\": 0.8675213675213675,\n\
\ \"acc_norm_stderr\": 0.022209309073165612\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8109833971902938,\n\
\ \"acc_stderr\": 0.014000791294406999,\n \"acc_norm\": 0.8109833971902938,\n\
\ \"acc_norm_stderr\": 0.014000791294406999\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7254335260115607,\n \"acc_stderr\": 0.02402774515526502,\n\
\ \"acc_norm\": 0.7254335260115607,\n \"acc_norm_stderr\": 0.02402774515526502\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4,\n\
\ \"acc_stderr\": 0.01638463841038082,\n \"acc_norm\": 0.4,\n \
\ \"acc_norm_stderr\": 0.01638463841038082\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7320261437908496,\n \"acc_stderr\": 0.025360603796242557,\n\
\ \"acc_norm\": 0.7320261437908496,\n \"acc_norm_stderr\": 0.025360603796242557\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7202572347266881,\n\
\ \"acc_stderr\": 0.025494259350694912,\n \"acc_norm\": 0.7202572347266881,\n\
\ \"acc_norm_stderr\": 0.025494259350694912\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7222222222222222,\n \"acc_stderr\": 0.024922001168886335,\n\
\ \"acc_norm\": 0.7222222222222222,\n \"acc_norm_stderr\": 0.024922001168886335\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.46808510638297873,\n \"acc_stderr\": 0.029766675075873862,\n \
\ \"acc_norm\": 0.46808510638297873,\n \"acc_norm_stderr\": 0.029766675075873862\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4784876140808344,\n\
\ \"acc_stderr\": 0.012758410941038915,\n \"acc_norm\": 0.4784876140808344,\n\
\ \"acc_norm_stderr\": 0.012758410941038915\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6699346405228758,\n \"acc_stderr\": 0.019023726160724553,\n \
\ \"acc_norm\": 0.6699346405228758,\n \"acc_norm_stderr\": 0.019023726160724553\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8308457711442786,\n\
\ \"acc_stderr\": 0.02650859065623327,\n \"acc_norm\": 0.8308457711442786,\n\
\ \"acc_norm_stderr\": 0.02650859065623327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5180722891566265,\n\
\ \"acc_stderr\": 0.03889951252827216,\n \"acc_norm\": 0.5180722891566265,\n\
\ \"acc_norm_stderr\": 0.03889951252827216\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.847953216374269,\n \"acc_stderr\": 0.027539122889061456,\n\
\ \"acc_norm\": 0.847953216374269,\n \"acc_norm_stderr\": 0.027539122889061456\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.3072215422276622,\n\
\ \"mc1_stderr\": 0.016150201321323016,\n \"mc2\": 0.45736495798430604,\n\
\ \"mc2_stderr\": 0.015085741763833444\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7940015785319653,\n \"acc_stderr\": 0.011366474352008826\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.620166793025019,\n \
\ \"acc_stderr\": 0.013368818096960498\n }\n}\n```"
repo_url: https://huggingface.co/FuseAI/FuseChat-7B-TA
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|arc:challenge|25_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|gsm8k|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hellaswag|10_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T13-09-42.780252.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T13-09-42.780252.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- '**/details_harness|winogrande|5_2024-03-10T13-09-42.780252.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T13-09-42.780252.parquet'
- config_name: results
data_files:
- split: 2024_03_10T13_09_42.780252
path:
- results_2024-03-10T13-09-42.780252.parquet
- split: latest
path:
- results_2024-03-10T13-09-42.780252.parquet
---
# Dataset Card for Evaluation run of FuseAI/FuseChat-7B-TA
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [FuseAI/FuseChat-7B-TA](https://huggingface.co/FuseAI/FuseChat-7B-TA) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_FuseAI__FuseChat-7B-TA",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T13:09:42.780252](https://huggingface.co/datasets/open-llm-leaderboard/details_FuseAI__FuseChat-7B-TA/blob/main/results_2024-03-10T13-09-42.780252.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.640970662790146,
"acc_stderr": 0.032240544947752886,
"acc_norm": 0.6427814675983042,
"acc_norm_stderr": 0.03289036996539776,
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323016,
"mc2": 0.45736495798430604,
"mc2_stderr": 0.015085741763833444
},
"harness|arc:challenge|25": {
"acc": 0.5793515358361775,
"acc_stderr": 0.014426211252508397,
"acc_norm": 0.6254266211604096,
"acc_norm_stderr": 0.014144193471893452
},
"harness|hellaswag|10": {
"acc": 0.6491734714200359,
"acc_stderr": 0.004762534245488402,
"acc_norm": 0.842162915753834,
"acc_norm_stderr": 0.0036384306206139272
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5851851851851851,
"acc_stderr": 0.04256193767901408,
"acc_norm": 0.5851851851851851,
"acc_norm_stderr": 0.04256193767901408
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.64,
"acc_stderr": 0.04824181513244218,
"acc_norm": 0.64,
"acc_norm_stderr": 0.04824181513244218
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6867924528301886,
"acc_stderr": 0.028544793319055326,
"acc_norm": 0.6867924528301886,
"acc_norm_stderr": 0.028544793319055326
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.75,
"acc_stderr": 0.03621034121889507,
"acc_norm": 0.75,
"acc_norm_stderr": 0.03621034121889507
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.43137254901960786,
"acc_stderr": 0.04928099597287534,
"acc_norm": 0.43137254901960786,
"acc_norm_stderr": 0.04928099597287534
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.74,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.74,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5702127659574469,
"acc_stderr": 0.03236214467715564,
"acc_norm": 0.5702127659574469,
"acc_norm_stderr": 0.03236214467715564
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.5350877192982456,
"acc_stderr": 0.046920083813689104,
"acc_norm": 0.5350877192982456,
"acc_norm_stderr": 0.046920083813689104
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5517241379310345,
"acc_stderr": 0.04144311810878151,
"acc_norm": 0.5517241379310345,
"acc_norm_stderr": 0.04144311810878151
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.02546714904546955,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.02546714904546955
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.5317460317460317,
"acc_stderr": 0.04463112720677172,
"acc_norm": 0.5317460317460317,
"acc_norm_stderr": 0.04463112720677172
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.28,
"acc_stderr": 0.045126085985421276,
"acc_norm": 0.28,
"acc_norm_stderr": 0.045126085985421276
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7903225806451613,
"acc_stderr": 0.023157879349083525,
"acc_norm": 0.7903225806451613,
"acc_norm_stderr": 0.023157879349083525
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.49261083743842365,
"acc_stderr": 0.035176035403610084,
"acc_norm": 0.49261083743842365,
"acc_norm_stderr": 0.035176035403610084
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7777777777777778,
"acc_stderr": 0.02962022787479048,
"acc_norm": 0.7777777777777778,
"acc_norm_stderr": 0.02962022787479048
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.917098445595855,
"acc_stderr": 0.01989934131572178,
"acc_norm": 0.917098445595855,
"acc_norm_stderr": 0.01989934131572178
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.658974358974359,
"acc_stderr": 0.02403548967633508,
"acc_norm": 0.658974358974359,
"acc_norm_stderr": 0.02403548967633508
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.3333333333333333,
"acc_stderr": 0.028742040903948492,
"acc_norm": 0.3333333333333333,
"acc_norm_stderr": 0.028742040903948492
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6638655462184874,
"acc_stderr": 0.030684737115135353,
"acc_norm": 0.6638655462184874,
"acc_norm_stderr": 0.030684737115135353
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.304635761589404,
"acc_stderr": 0.03757949922943343,
"acc_norm": 0.304635761589404,
"acc_norm_stderr": 0.03757949922943343
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8458715596330275,
"acc_stderr": 0.0154808268653743,
"acc_norm": 0.8458715596330275,
"acc_norm_stderr": 0.0154808268653743
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5370370370370371,
"acc_stderr": 0.03400603625538272,
"acc_norm": 0.5370370370370371,
"acc_norm_stderr": 0.03400603625538272
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7892156862745098,
"acc_stderr": 0.02862654791243742,
"acc_norm": 0.7892156862745098,
"acc_norm_stderr": 0.02862654791243742
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621115,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621115
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.7174887892376681,
"acc_stderr": 0.030216831011508762,
"acc_norm": 0.7174887892376681,
"acc_norm_stderr": 0.030216831011508762
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7557251908396947,
"acc_stderr": 0.037683359597287434,
"acc_norm": 0.7557251908396947,
"acc_norm_stderr": 0.037683359597287434
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7520661157024794,
"acc_stderr": 0.03941897526516303,
"acc_norm": 0.7520661157024794,
"acc_norm_stderr": 0.03941897526516303
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252627,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252627
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.039891398595317706,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.039891398595317706
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8675213675213675,
"acc_stderr": 0.022209309073165612,
"acc_norm": 0.8675213675213675,
"acc_norm_stderr": 0.022209309073165612
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8109833971902938,
"acc_stderr": 0.014000791294406999,
"acc_norm": 0.8109833971902938,
"acc_norm_stderr": 0.014000791294406999
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7254335260115607,
"acc_stderr": 0.02402774515526502,
"acc_norm": 0.7254335260115607,
"acc_norm_stderr": 0.02402774515526502
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4,
"acc_stderr": 0.01638463841038082,
"acc_norm": 0.4,
"acc_norm_stderr": 0.01638463841038082
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7320261437908496,
"acc_stderr": 0.025360603796242557,
"acc_norm": 0.7320261437908496,
"acc_norm_stderr": 0.025360603796242557
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7202572347266881,
"acc_stderr": 0.025494259350694912,
"acc_norm": 0.7202572347266881,
"acc_norm_stderr": 0.025494259350694912
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.024922001168886335,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.024922001168886335
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.46808510638297873,
"acc_stderr": 0.029766675075873862,
"acc_norm": 0.46808510638297873,
"acc_norm_stderr": 0.029766675075873862
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4784876140808344,
"acc_stderr": 0.012758410941038915,
"acc_norm": 0.4784876140808344,
"acc_norm_stderr": 0.012758410941038915
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6699346405228758,
"acc_stderr": 0.019023726160724553,
"acc_norm": 0.6699346405228758,
"acc_norm_stderr": 0.019023726160724553
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8308457711442786,
"acc_stderr": 0.02650859065623327,
"acc_norm": 0.8308457711442786,
"acc_norm_stderr": 0.02650859065623327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5180722891566265,
"acc_stderr": 0.03889951252827216,
"acc_norm": 0.5180722891566265,
"acc_norm_stderr": 0.03889951252827216
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.847953216374269,
"acc_stderr": 0.027539122889061456,
"acc_norm": 0.847953216374269,
"acc_norm_stderr": 0.027539122889061456
},
"harness|truthfulqa:mc|0": {
"mc1": 0.3072215422276622,
"mc1_stderr": 0.016150201321323016,
"mc2": 0.45736495798430604,
"mc2_stderr": 0.015085741763833444
},
"harness|winogrande|5": {
"acc": 0.7940015785319653,
"acc_stderr": 0.011366474352008826
},
"harness|gsm8k|5": {
"acc": 0.620166793025019,
"acc_stderr": 0.013368818096960498
}
}
```
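The per-task scores above can also be aggregated programmatically. A minimal, self-contained sketch (using a results dict of the same shape, truncated to two illustrative sub-tasks rather than the full set):

```python
import json

# A results dict shaped like the JSON above, truncated to two sub-tasks for illustration.
results = json.loads("""
{
  "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.34, "acc_stderr": 0.0476},
  "harness|hendrycksTest-anatomy|5": {"acc": 0.5851851851851851, "acc_stderr": 0.0426}
}
""")

# Average accuracy over the MMLU (hendrycksTest) sub-tasks present in the dict.
mmlu = [v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest-")]
mmlu_acc = sum(mmlu) / len(mmlu)
print(round(mmlu_acc, 4))
```

With all 57 MMLU sub-tasks loaded from the full results file, the same averaging reproduces the aggregate MMLU number shown on the leaderboard.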
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
NorGLM/NO-MRPC | NorGLM | "2024-10-01T18:37:41Z" | 0 | 0 | [
"language:no",
"license:cc-by-4.0",
"size_categories:1K<n<10K",
"format:json",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"arxiv:2312.01314",
"region:us"
] | null | "2024-03-10T13:15:48Z" | ---
license: cc-by-4.0
language:
- 'no'
---
## Dataset Card for NO-MRPC
The dataset is machine translated from [The Microsoft Research Paraphrase Corpus](https://www.microsoft.com/en-us/download/details.aspx?id=52398), which contains sentence pairs from English news sources with human annotations for whether the sentences in each pair are semantically equivalent.
We keep the original split of MRPC for the NO-MRPC dataset. More information can be found at [this link](https://www.tensorflow.org/datasets/catalog/glue#gluemrpc).
This dataset belongs to NLEBench, a set of Norwegian benchmarks for evaluating Norwegian Natural Language Understanding (NLU) tasks.
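MRPC-style paraphrase detection is conventionally scored with accuracy and F1 over the binary equivalence labels. A minimal, self-contained sketch of that scoring (gold labels and predictions are made-up illustrative values, not taken from this dataset):

```python
# Gold labels (1 = paraphrase, 0 = not) and model predictions -- illustrative values only.
gold = [1, 0, 1, 1, 0, 1]
pred = [1, 0, 0, 1, 0, 1]

# Confusion counts for the positive (paraphrase) class.
tp = sum(1 for g, p in zip(gold, pred) if g == 1 and p == 1)
fp = sum(1 for g, p in zip(gold, pred) if g == 0 and p == 1)
fn = sum(1 for g, p in zip(gold, pred) if g == 1 and p == 0)

accuracy = sum(1 for g, p in zip(gold, pred) if g == p) / len(gold)
precision = tp / (tp + fp) if tp + fp else 0.0
recall = tp / (tp + fn) if tp + fn else 0.0
f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
print(accuracy, round(f1, 3))
```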
## Licensing Information
This dataset is built upon an existing dataset; we therefore follow its original license information.
## Citation Information
If you find our work helpful, please cite our paper:
```
@article{liu2023nlebench+,
title={NLEBench+ NorGLM: A Comprehensive Empirical Analysis and Benchmark Dataset for Generative Language Models in Norwegian},
author={Liu, Peng and Zhang, Lemei and Farup, Terje Nissen and Lauvrak, Even W and Ingvaldsen, Jon Espen and Eide, Simen and Gulla, Jon Atle and Yang, Zhirong},
journal={arXiv preprint arXiv:2312.01314},
year={2023}
}
```
|
Amanaccessassist/PotatoChips | Amanaccessassist | "2024-03-12T09:07:55Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:image",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T13:20:02Z" | ---
dataset_info:
features:
- name: image
dtype: image
- name: label
dtype:
class_label:
names:
'0': Defective
'1': Non-Defective
splits:
- name: train
num_bytes: 871755915.6420395
num_examples: 816
- name: test
num_bytes: 155260919.35796046
num_examples: 145
download_size: 1022733503
dataset_size: 1027016835.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
qqqqq1/Javaragas | qqqqq1 | "2024-03-10T13:27:20Z" | 0 | 0 | [
"language:zh",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T13:24:35Z" | ---
language:
- zh
pretty_name: java ragas
--- |
abideen/ultrachat-uncensored-10k | abideen | "2024-03-10T13:26:21Z" | 0 | 1 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T13:26:19Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: data
sequence: string
splits:
- name: train
num_bytes: 57325369
num_examples: 10000
download_size: 28939255
dataset_size: 57325369
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
open-llm-leaderboard-old/details_Evillain__StarDust_20B_v0.2 | open-llm-leaderboard-old | "2024-03-10T13:26:45Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T13:26:23Z" | ---
pretty_name: Evaluation run of Evillain/StarDust_20B_v0.2
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Evillain/StarDust_20B_v0.2](https://huggingface.co/Evillain/StarDust_20B_v0.2)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Evillain__StarDust_20B_v0.2\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T13:24:01.018769](https://huggingface.co/datasets/open-llm-leaderboard/details_Evillain__StarDust_20B_v0.2/blob/main/results_2024-03-10T13-24-01.018769.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.5908215683607256,\n\
\ \"acc_stderr\": 0.03303281253335369,\n \"acc_norm\": 0.5972987818649796,\n\
\ \"acc_norm_stderr\": 0.03373270810139984,\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5142552220243882,\n\
\ \"mc2_stderr\": 0.015971922407564575\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.5870307167235495,\n \"acc_stderr\": 0.014388344935398326,\n\
\ \"acc_norm\": 0.6100682593856656,\n \"acc_norm_stderr\": 0.014252959848892887\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6470822545309699,\n\
\ \"acc_stderr\": 0.004769007545082276,\n \"acc_norm\": 0.8375821549492133,\n\
\ \"acc_norm_stderr\": 0.003680798950531919\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.34,\n \"acc_stderr\": 0.047609522856952365,\n \
\ \"acc_norm\": 0.34,\n \"acc_norm_stderr\": 0.047609522856952365\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5111111111111111,\n\
\ \"acc_stderr\": 0.04318275491977976,\n \"acc_norm\": 0.5111111111111111,\n\
\ \"acc_norm_stderr\": 0.04318275491977976\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6447368421052632,\n \"acc_stderr\": 0.038947344870133176,\n\
\ \"acc_norm\": 0.6447368421052632,\n \"acc_norm_stderr\": 0.038947344870133176\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.61,\n\
\ \"acc_stderr\": 0.04902071300001974,\n \"acc_norm\": 0.61,\n \
\ \"acc_norm_stderr\": 0.04902071300001974\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6150943396226415,\n \"acc_stderr\": 0.02994649856769995,\n\
\ \"acc_norm\": 0.6150943396226415,\n \"acc_norm_stderr\": 0.02994649856769995\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7013888888888888,\n\
\ \"acc_stderr\": 0.03827052357950756,\n \"acc_norm\": 0.7013888888888888,\n\
\ \"acc_norm_stderr\": 0.03827052357950756\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.41,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.41,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.42,\n \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\"\
: 0.42,\n \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.5375722543352601,\n\
\ \"acc_stderr\": 0.03801685104524458,\n \"acc_norm\": 0.5375722543352601,\n\
\ \"acc_norm_stderr\": 0.03801685104524458\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.29411764705882354,\n \"acc_stderr\": 0.04533838195929776,\n\
\ \"acc_norm\": 0.29411764705882354,\n \"acc_norm_stderr\": 0.04533838195929776\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \"acc_norm\": 0.69,\n\
\ \"acc_norm_stderr\": 0.04648231987117316\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5276595744680851,\n \"acc_stderr\": 0.03263597118409769,\n\
\ \"acc_norm\": 0.5276595744680851,\n \"acc_norm_stderr\": 0.03263597118409769\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.04303684033537315,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.04303684033537315\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482758,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482758\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3492063492063492,\n \"acc_stderr\": 0.02455229220934265,\n \"\
acc_norm\": 0.3492063492063492,\n \"acc_norm_stderr\": 0.02455229220934265\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.35714285714285715,\n\
\ \"acc_stderr\": 0.04285714285714281,\n \"acc_norm\": 0.35714285714285715,\n\
\ \"acc_norm_stderr\": 0.04285714285714281\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.4,\n \"acc_stderr\": 0.04923659639173309,\n \
\ \"acc_norm\": 0.4,\n \"acc_norm_stderr\": 0.04923659639173309\n },\n\
\ \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.6967741935483871,\n\
\ \"acc_stderr\": 0.026148685930671746,\n \"acc_norm\": 0.6967741935483871,\n\
\ \"acc_norm_stderr\": 0.026148685930671746\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.4975369458128079,\n \"acc_stderr\": 0.03517945038691063,\n\
\ \"acc_norm\": 0.4975369458128079,\n \"acc_norm_stderr\": 0.03517945038691063\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.55,\n \"acc_stderr\": 0.049999999999999996,\n \"acc_norm\"\
: 0.55,\n \"acc_norm_stderr\": 0.049999999999999996\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7515151515151515,\n \"acc_stderr\": 0.03374402644139404,\n\
\ \"acc_norm\": 0.7515151515151515,\n \"acc_norm_stderr\": 0.03374402644139404\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7474747474747475,\n \"acc_stderr\": 0.030954055470365907,\n \"\
acc_norm\": 0.7474747474747475,\n \"acc_norm_stderr\": 0.030954055470365907\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8652849740932642,\n \"acc_stderr\": 0.024639789097709437,\n\
\ \"acc_norm\": 0.8652849740932642,\n \"acc_norm_stderr\": 0.024639789097709437\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6076923076923076,\n \"acc_stderr\": 0.02475600038213095,\n \
\ \"acc_norm\": 0.6076923076923076,\n \"acc_norm_stderr\": 0.02475600038213095\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34814814814814815,\n \"acc_stderr\": 0.029045600290616258,\n \
\ \"acc_norm\": 0.34814814814814815,\n \"acc_norm_stderr\": 0.029045600290616258\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6218487394957983,\n \"acc_stderr\": 0.031499305777849054,\n\
\ \"acc_norm\": 0.6218487394957983,\n \"acc_norm_stderr\": 0.031499305777849054\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.33774834437086093,\n \"acc_stderr\": 0.038615575462551684,\n \"\
acc_norm\": 0.33774834437086093,\n \"acc_norm_stderr\": 0.038615575462551684\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.7706422018348624,\n \"acc_stderr\": 0.018025349724618684,\n \"\
acc_norm\": 0.7706422018348624,\n \"acc_norm_stderr\": 0.018025349724618684\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.46296296296296297,\n \"acc_stderr\": 0.03400603625538271,\n \"\
acc_norm\": 0.46296296296296297,\n \"acc_norm_stderr\": 0.03400603625538271\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7794117647058824,\n \"acc_stderr\": 0.02910225438967408,\n \"\
acc_norm\": 0.7794117647058824,\n \"acc_norm_stderr\": 0.02910225438967408\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6860986547085202,\n\
\ \"acc_stderr\": 0.03114679648297246,\n \"acc_norm\": 0.6860986547085202,\n\
\ \"acc_norm_stderr\": 0.03114679648297246\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.6870229007633588,\n \"acc_stderr\": 0.04066962905677698,\n\
\ \"acc_norm\": 0.6870229007633588,\n \"acc_norm_stderr\": 0.04066962905677698\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7603305785123967,\n \"acc_stderr\": 0.03896878985070416,\n \"\
acc_norm\": 0.7603305785123967,\n \"acc_norm_stderr\": 0.03896878985070416\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7685185185185185,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.7685185185185185,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7484662576687117,\n \"acc_stderr\": 0.03408997886857529,\n\
\ \"acc_norm\": 0.7484662576687117,\n \"acc_norm_stderr\": 0.03408997886857529\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.3482142857142857,\n\
\ \"acc_stderr\": 0.04521829902833586,\n \"acc_norm\": 0.3482142857142857,\n\
\ \"acc_norm_stderr\": 0.04521829902833586\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7184466019417476,\n \"acc_stderr\": 0.04453254836326468,\n\
\ \"acc_norm\": 0.7184466019417476,\n \"acc_norm_stderr\": 0.04453254836326468\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8547008547008547,\n\
\ \"acc_stderr\": 0.023086635086841407,\n \"acc_norm\": 0.8547008547008547,\n\
\ \"acc_norm_stderr\": 0.023086635086841407\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.59,\n \"acc_stderr\": 0.049431107042371025,\n \
\ \"acc_norm\": 0.59,\n \"acc_norm_stderr\": 0.049431107042371025\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139956,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139956\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.6734104046242775,\n \"acc_stderr\": 0.025248264774242832,\n\
\ \"acc_norm\": 0.6734104046242775,\n \"acc_norm_stderr\": 0.025248264774242832\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3854748603351955,\n\
\ \"acc_stderr\": 0.01627792703963819,\n \"acc_norm\": 0.3854748603351955,\n\
\ \"acc_norm_stderr\": 0.01627792703963819\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.6470588235294118,\n \"acc_stderr\": 0.027363593284684965,\n\
\ \"acc_norm\": 0.6470588235294118,\n \"acc_norm_stderr\": 0.027363593284684965\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.662379421221865,\n\
\ \"acc_stderr\": 0.026858825879488544,\n \"acc_norm\": 0.662379421221865,\n\
\ \"acc_norm_stderr\": 0.026858825879488544\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7067901234567902,\n \"acc_stderr\": 0.025329888171900922,\n\
\ \"acc_norm\": 0.7067901234567902,\n \"acc_norm_stderr\": 0.025329888171900922\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4716312056737589,\n \"acc_stderr\": 0.029779450957303062,\n \
\ \"acc_norm\": 0.4716312056737589,\n \"acc_norm_stderr\": 0.029779450957303062\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.45632333767926986,\n\
\ \"acc_stderr\": 0.012721420501462546,\n \"acc_norm\": 0.45632333767926986,\n\
\ \"acc_norm_stderr\": 0.012721420501462546\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.5625,\n \"acc_stderr\": 0.030134614954403924,\n \
\ \"acc_norm\": 0.5625,\n \"acc_norm_stderr\": 0.030134614954403924\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6323529411764706,\n \"acc_stderr\": 0.01950629169395486,\n \
\ \"acc_norm\": 0.6323529411764706,\n \"acc_norm_stderr\": 0.01950629169395486\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6272727272727273,\n\
\ \"acc_stderr\": 0.04631381319425465,\n \"acc_norm\": 0.6272727272727273,\n\
\ \"acc_norm_stderr\": 0.04631381319425465\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7061224489795919,\n \"acc_stderr\": 0.029162738410249772,\n\
\ \"acc_norm\": 0.7061224489795919,\n \"acc_norm_stderr\": 0.029162738410249772\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.7661691542288557,\n\
\ \"acc_stderr\": 0.029929415408348384,\n \"acc_norm\": 0.7661691542288557,\n\
\ \"acc_norm_stderr\": 0.029929415408348384\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.88,\n \"acc_stderr\": 0.03265986323710906,\n \
\ \"acc_norm\": 0.88,\n \"acc_norm_stderr\": 0.03265986323710906\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.4939759036144578,\n\
\ \"acc_stderr\": 0.03892212195333045,\n \"acc_norm\": 0.4939759036144578,\n\
\ \"acc_norm_stderr\": 0.03892212195333045\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.7719298245614035,\n \"acc_stderr\": 0.032180937956023566,\n\
\ \"acc_norm\": 0.7719298245614035,\n \"acc_norm_stderr\": 0.032180937956023566\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.37209302325581395,\n\
\ \"mc1_stderr\": 0.016921090118814035,\n \"mc2\": 0.5142552220243882,\n\
\ \"mc2_stderr\": 0.015971922407564575\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7726913970007893,\n \"acc_stderr\": 0.011778612167091088\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.2403335860500379,\n \
\ \"acc_stderr\": 0.011769580703836945\n }\n}\n```"
repo_url: https://huggingface.co/Evillain/StarDust_20B_v0.2
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|arc:challenge|25_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|gsm8k|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hellaswag|10_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T13-24-01.018769.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T13-24-01.018769.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- '**/details_harness|winogrande|5_2024-03-10T13-24-01.018769.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T13-24-01.018769.parquet'
- config_name: results
data_files:
- split: 2024_03_10T13_24_01.018769
path:
- results_2024-03-10T13-24-01.018769.parquet
- split: latest
path:
- results_2024-03-10T13-24-01.018769.parquet
---
# Dataset Card for Evaluation run of Evillain/StarDust_20B_v0.2
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Evillain/StarDust_20B_v0.2](https://huggingface.co/Evillain/StarDust_20B_v0.2) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Evillain__StarDust_20B_v0.2",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T13:24:01.018769](https://huggingface.co/datasets/open-llm-leaderboard/details_Evillain__StarDust_20B_v0.2/blob/main/results_2024-03-10T13-24-01.018769.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.5908215683607256,
"acc_stderr": 0.03303281253335369,
"acc_norm": 0.5972987818649796,
"acc_norm_stderr": 0.03373270810139984,
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5142552220243882,
"mc2_stderr": 0.015971922407564575
},
"harness|arc:challenge|25": {
"acc": 0.5870307167235495,
"acc_stderr": 0.014388344935398326,
"acc_norm": 0.6100682593856656,
"acc_norm_stderr": 0.014252959848892887
},
"harness|hellaswag|10": {
"acc": 0.6470822545309699,
"acc_stderr": 0.004769007545082276,
"acc_norm": 0.8375821549492133,
"acc_norm_stderr": 0.003680798950531919
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.34,
"acc_stderr": 0.047609522856952365,
"acc_norm": 0.34,
"acc_norm_stderr": 0.047609522856952365
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5111111111111111,
"acc_stderr": 0.04318275491977976,
"acc_norm": 0.5111111111111111,
"acc_norm_stderr": 0.04318275491977976
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6447368421052632,
"acc_stderr": 0.038947344870133176,
"acc_norm": 0.6447368421052632,
"acc_norm_stderr": 0.038947344870133176
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.61,
"acc_stderr": 0.04902071300001974,
"acc_norm": 0.61,
"acc_norm_stderr": 0.04902071300001974
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6150943396226415,
"acc_stderr": 0.02994649856769995,
"acc_norm": 0.6150943396226415,
"acc_norm_stderr": 0.02994649856769995
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7013888888888888,
"acc_stderr": 0.03827052357950756,
"acc_norm": 0.7013888888888888,
"acc_norm_stderr": 0.03827052357950756
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.41,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.41,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.42,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.42,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.5375722543352601,
"acc_stderr": 0.03801685104524458,
"acc_norm": 0.5375722543352601,
"acc_norm_stderr": 0.03801685104524458
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.29411764705882354,
"acc_stderr": 0.04533838195929776,
"acc_norm": 0.29411764705882354,
"acc_norm_stderr": 0.04533838195929776
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5276595744680851,
"acc_stderr": 0.03263597118409769,
"acc_norm": 0.5276595744680851,
"acc_norm_stderr": 0.03263597118409769
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.04303684033537315,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.04303684033537315
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482758,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482758
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3492063492063492,
"acc_stderr": 0.02455229220934265,
"acc_norm": 0.3492063492063492,
"acc_norm_stderr": 0.02455229220934265
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.35714285714285715,
"acc_stderr": 0.04285714285714281,
"acc_norm": 0.35714285714285715,
"acc_norm_stderr": 0.04285714285714281
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.4,
"acc_stderr": 0.04923659639173309,
"acc_norm": 0.4,
"acc_norm_stderr": 0.04923659639173309
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.6967741935483871,
"acc_stderr": 0.026148685930671746,
"acc_norm": 0.6967741935483871,
"acc_norm_stderr": 0.026148685930671746
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4975369458128079,
"acc_stderr": 0.03517945038691063,
"acc_norm": 0.4975369458128079,
"acc_norm_stderr": 0.03517945038691063
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.55,
"acc_stderr": 0.049999999999999996,
"acc_norm": 0.55,
"acc_norm_stderr": 0.049999999999999996
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7515151515151515,
"acc_stderr": 0.03374402644139404,
"acc_norm": 0.7515151515151515,
"acc_norm_stderr": 0.03374402644139404
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7474747474747475,
"acc_stderr": 0.030954055470365907,
"acc_norm": 0.7474747474747475,
"acc_norm_stderr": 0.030954055470365907
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8652849740932642,
"acc_stderr": 0.024639789097709437,
"acc_norm": 0.8652849740932642,
"acc_norm_stderr": 0.024639789097709437
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6076923076923076,
"acc_stderr": 0.02475600038213095,
"acc_norm": 0.6076923076923076,
"acc_norm_stderr": 0.02475600038213095
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34814814814814815,
"acc_stderr": 0.029045600290616258,
"acc_norm": 0.34814814814814815,
"acc_norm_stderr": 0.029045600290616258
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6218487394957983,
"acc_stderr": 0.031499305777849054,
"acc_norm": 0.6218487394957983,
"acc_norm_stderr": 0.031499305777849054
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.33774834437086093,
"acc_stderr": 0.038615575462551684,
"acc_norm": 0.33774834437086093,
"acc_norm_stderr": 0.038615575462551684
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.7706422018348624,
"acc_stderr": 0.018025349724618684,
"acc_norm": 0.7706422018348624,
"acc_norm_stderr": 0.018025349724618684
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.46296296296296297,
"acc_stderr": 0.03400603625538271,
"acc_norm": 0.46296296296296297,
"acc_norm_stderr": 0.03400603625538271
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7794117647058824,
"acc_stderr": 0.02910225438967408,
"acc_norm": 0.7794117647058824,
"acc_norm_stderr": 0.02910225438967408
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6860986547085202,
"acc_stderr": 0.03114679648297246,
"acc_norm": 0.6860986547085202,
"acc_norm_stderr": 0.03114679648297246
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.6870229007633588,
"acc_stderr": 0.04066962905677698,
"acc_norm": 0.6870229007633588,
"acc_norm_stderr": 0.04066962905677698
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7603305785123967,
"acc_stderr": 0.03896878985070416,
"acc_norm": 0.7603305785123967,
"acc_norm_stderr": 0.03896878985070416
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7685185185185185,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.7685185185185185,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7484662576687117,
"acc_stderr": 0.03408997886857529,
"acc_norm": 0.7484662576687117,
"acc_norm_stderr": 0.03408997886857529
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.3482142857142857,
"acc_stderr": 0.04521829902833586,
"acc_norm": 0.3482142857142857,
"acc_norm_stderr": 0.04521829902833586
},
"harness|hendrycksTest-management|5": {
"acc": 0.7184466019417476,
"acc_stderr": 0.04453254836326468,
"acc_norm": 0.7184466019417476,
"acc_norm_stderr": 0.04453254836326468
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8547008547008547,
"acc_stderr": 0.023086635086841407,
"acc_norm": 0.8547008547008547,
"acc_norm_stderr": 0.023086635086841407
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139956,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139956
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.6734104046242775,
"acc_stderr": 0.025248264774242832,
"acc_norm": 0.6734104046242775,
"acc_norm_stderr": 0.025248264774242832
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3854748603351955,
"acc_stderr": 0.01627792703963819,
"acc_norm": 0.3854748603351955,
"acc_norm_stderr": 0.01627792703963819
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.6470588235294118,
"acc_stderr": 0.027363593284684965,
"acc_norm": 0.6470588235294118,
"acc_norm_stderr": 0.027363593284684965
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.662379421221865,
"acc_stderr": 0.026858825879488544,
"acc_norm": 0.662379421221865,
"acc_norm_stderr": 0.026858825879488544
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7067901234567902,
"acc_stderr": 0.025329888171900922,
"acc_norm": 0.7067901234567902,
"acc_norm_stderr": 0.025329888171900922
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4716312056737589,
"acc_stderr": 0.029779450957303062,
"acc_norm": 0.4716312056737589,
"acc_norm_stderr": 0.029779450957303062
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.45632333767926986,
"acc_stderr": 0.012721420501462546,
"acc_norm": 0.45632333767926986,
"acc_norm_stderr": 0.012721420501462546
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.5625,
"acc_stderr": 0.030134614954403924,
"acc_norm": 0.5625,
"acc_norm_stderr": 0.030134614954403924
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6323529411764706,
"acc_stderr": 0.01950629169395486,
"acc_norm": 0.6323529411764706,
"acc_norm_stderr": 0.01950629169395486
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6272727272727273,
"acc_stderr": 0.04631381319425465,
"acc_norm": 0.6272727272727273,
"acc_norm_stderr": 0.04631381319425465
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7061224489795919,
"acc_stderr": 0.029162738410249772,
"acc_norm": 0.7061224489795919,
"acc_norm_stderr": 0.029162738410249772
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.7661691542288557,
"acc_stderr": 0.029929415408348384,
"acc_norm": 0.7661691542288557,
"acc_norm_stderr": 0.029929415408348384
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.88,
"acc_stderr": 0.03265986323710906,
"acc_norm": 0.88,
"acc_norm_stderr": 0.03265986323710906
},
"harness|hendrycksTest-virology|5": {
"acc": 0.4939759036144578,
"acc_stderr": 0.03892212195333045,
"acc_norm": 0.4939759036144578,
"acc_norm_stderr": 0.03892212195333045
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.7719298245614035,
"acc_stderr": 0.032180937956023566,
"acc_norm": 0.7719298245614035,
"acc_norm_stderr": 0.032180937956023566
},
"harness|truthfulqa:mc|0": {
"mc1": 0.37209302325581395,
"mc1_stderr": 0.016921090118814035,
"mc2": 0.5142552220243882,
"mc2_stderr": 0.015971922407564575
},
"harness|winogrande|5": {
"acc": 0.7726913970007893,
"acc_stderr": 0.011778612167091088
},
"harness|gsm8k|5": {
"acc": 0.2403335860500379,
"acc_stderr": 0.011769580703836945
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ErasmoMestreDasVozez/audio | ErasmoMestreDasVozez | "2024-03-10T13:41:47Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T13:39:05Z" | ---
license: openrail
---
|
rumeysacelik/turkishReviews-ds-commerce | rumeysacelik | "2024-03-10T13:54:19Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T13:54:14Z" | ---
dataset_info:
features:
- name: review
dtype: string
- name: review_length
dtype: int64
splits:
- name: train
num_bytes: 1252876.2642514652
num_examples: 3378
- name: validation
num_bytes: 139455.7357485349
num_examples: 376
download_size: 896651
dataset_size: 1392332.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
NorGLM/NO-ConvAI2 | NorGLM | "2024-10-01T18:31:21Z" | 0 | 0 | [
"language:no",
"license:unknown",
"size_categories:100K<n<1M",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"arxiv:2312.01314",
"region:us"
] | null | "2024-03-10T13:57:40Z" | ---
license: unknown
language:
- 'no'
---
## Dataset Card for NO-ConvAI2
NO-ConvAI2 is an open-domain human-to-bot conversational dataset machine-translated from [ConvAI2](https://parl.ai/projects/convai2/).
In this dataset, the dialog_ids and turn_ids are omitted; each line of the text is written in the *Bot | Human* format.
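Each line can be split back into its two turns with a few lines of Python. This is a sketch: the assumption that a single `|` separates the bot turn from the human turn comes only from the *Bot | Human* description above, so inspect a few raw lines before relying on it.

```python
def parse_pair(line):
    """Split one 'Bot | Human' line into its two turns.

    Assumes a single '|' separates the bot turn from the human turn;
    inspect a few raw lines from the dataset to confirm the delimiter.
    """
    bot, human = line.split("|", 1)
    return {"bot": bot.strip(), "human": human.strip()}
```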
## Data Split
The dataset is split into train and test sets.
| | #conversation_pairs |
|-------|---------------------|
| train | 253937 |
| test | 28658 |
For more information on this dataset, please refer to [link](https://parl.ai/projects/convai2/).
## Licensing Information
This dataset is built upon an existing dataset. We therefore follow its original license information.
## Citation Information
If you feel our work is helpful, please cite our paper:
```
@article{liu2023nlebench+,
title={NLEBench+ NorGLM: A Comprehensive Empirical Analysis and Benchmark Dataset for Generative Language Models in Norwegian},
author={Liu, Peng and Zhang, Lemei and Farup, Terje Nissen and Lauvrak, Even W and Ingvaldsen, Jon Espen and Eide, Simen and Gulla, Jon Atle and Yang, Zhirong},
journal={arXiv preprint arXiv:2312.01314},
year={2023}
}
``` |
its5Q/otvetmailru | its5Q | "2024-03-10T19:29:35Z" | 0 | 0 | [
"task_categories:question-answering",
"language:ru",
"license:cc0-1.0",
"size_categories:100M<n<1B",
"region:us"
] | [
"question-answering"
] | "2024-03-10T14:17:12Z" | ---
license: cc0-1.0
task_categories:
- question-answering
language:
- ru
pretty_name: otvet.mail.ru questions
size_categories:
- 100M<n<1B
---
# Dataset Card for otvet.mail.ru questions
## Table of Contents
- [Table of Contents](#table-of-contents)
- [Dataset Description](#dataset-description)
- [Dataset Summary](#dataset-summary)
- [Languages](#languages)
- [Dataset Structure](#dataset-structure)
- [Dataset Creation](#dataset-creation)
- [Additional Information](#additional-information)
- [Dataset Curators](#dataset-curators)
## Dataset Description
### Dataset Summary
This is a dataset of questions and answers scraped from [otvet.mail.ru](https://otvet.mail.ru/). There are about 130 million questions with all their corresponding metadata that were posted before 03/05/2022 (the date the dataset was collected). This is a reupload of my dataset on [Kaggle](https://www.kaggle.com/datasets/atleast6characterss/otvetmailru-full)
### Languages
The dataset is mostly in Russian, but there may be other languages present.
## Dataset Structure
~~Please refer to the Dataset Viewer for more information on the dataset structure.~~
For now the Dataset Viewer doesn't work because of inconsistent data types across samples. I'll try to fix it later; in the meantime, the dataset can be used by downloading the ZSTD-compressed chunks, each consisting of 2_500_000 samples.
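A minimal sketch of reading one downloaded chunk is shown below. The `.jsonl.zst` record layout and the filename passed to `read_chunk` are assumptions, not documented facts — check the repository's file listing for the actual chunk names — and decompression relies on the third-party `zstandard` package.

```python
import json


def parse_jsonl(text):
    """Parse newline-delimited JSON records into a list of dicts, skipping blank lines."""
    return [json.loads(line) for line in text.splitlines() if line.strip()]


def read_chunk(path):
    """Decompress one ZSTD chunk and parse its records.

    Requires the third-party `zstandard` package; the `.jsonl.zst`
    layout and filename scheme are assumptions -- check the repo's
    file listing for the actual chunk names.
    """
    import zstandard  # pip install zstandard
    with open(path, "rb") as f:
        raw = zstandard.ZstdDecompressor().stream_reader(f).read()
    return parse_jsonl(raw.decode("utf-8"))
```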
## Dataset Creation
The data was scraped using AJAX endpoints that return the full question-and-answer metadata by an auto-incremented id.
## Additional Information
### Dataset Curators
- https://github.com/its5Q |
Alphonse-96/Eng_luo_10K | Alphonse-96 | "2024-03-10T14:29:18Z" | 0 | 0 | [
"license:mit",
"size_categories:10K<n<100K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T14:21:44Z" | ---
license: mit
---
|
Henry65/RepoSnipy_dataset | Henry65 | "2024-04-09T12:49:14Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:imagefolder",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T14:21:55Z" | ---
license: mit
---
|
dappyx/QazSyntQAD | dappyx | "2024-04-12T14:30:13Z" | 0 | 0 | [
"task_categories:question-answering",
"language:kk",
"size_categories:1K<n<10K",
"format:csv",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"question-answering"
] | "2024-03-10T14:23:55Z" | ---
task_categories:
- question-answering
language:
- kk
size_categories:
- n<1K
---
<h1>Qazaq Synthetic Question Answering Dataset (QazSyntQAD)</h1>
<h3>Dataset Description</h3>
This dataset was created from Wikipedia and Wikibooks data passed through Claude-3-Sonnet-20240229.
<br>
<h3>Dataset Author</h3>
This dataset was created by Adil Rakhimzhanov. |
Straive-Kripa/sp500_sga | Straive-Kripa | "2024-03-10T14:33:20Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:csv",
"modality:tabular",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T14:27:53Z" | ---
license: apache-2.0
---
https://www.kaggle.com/datasets/pierrelouisdanieau/financial-data-sp500-companies |
Lonewolf2441139/gcdata1 | Lonewolf2441139 | "2024-03-10T14:35:27Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T14:34:25Z" | ---
license: apache-2.0
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4730839
num_examples: 3044
download_size: 1680838
dataset_size: 4730839
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
imgmongelli/testdataframe | imgmongelli | "2024-03-10T14:36:56Z" | 0 | 0 | [
"license:mit",
"size_categories:n<1K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T14:35:32Z" | ---
license: mit
---
|
valdineiarcenio/galvaobueno1 | valdineiarcenio | "2024-03-10T14:38:20Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T14:35:55Z" | ---
license: openrail
---
|
Sayankotor/wikipedia_acticles_embs | Sayankotor | "2024-03-10T15:28:32Z" | 0 | 0 | [
"language:en",
"license:apache-2.0",
"size_categories:10M<n<100M",
"region:us"
] | null | "2024-03-10T14:43:28Z" | ---
license: apache-2.0
language:
- en
size_categories:
- 10M<n<100M
--- |
mikecho/isom5240grp20 | mikecho | "2024-03-21T16:35:49Z" | 0 | 0 | [
"license:apache-2.0",
"size_categories:n<1K",
"format:imagefolder",
"modality:image",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T14:49:00Z" | ---
license: apache-2.0
---
|
fathyshalaby/dds | fathyshalaby | "2024-03-10T14:49:09Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T14:49:06Z" | ---
dataset_info:
features:
- name: user-message
dtype: string
id: field
- name: question-rating
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: int32
id: suggestion
- name: status
dtype: string
id: question
- name: question-rating-suggestion
dtype: int32
id: suggestion
- name: question-rating-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: response
list:
- name: user_id
dtype: string
id: question
- name: value
dtype: string
id: suggestion
- name: status
dtype: string
id: question
- name: response-suggestion
dtype: string
id: suggestion
- name: response-suggestion-metadata
struct:
- name: type
dtype: string
id: suggestion-metadata
- name: score
dtype: float32
id: suggestion-metadata
- name: agent
dtype: string
id: suggestion-metadata
- name: external_id
dtype: string
id: external_id
- name: metadata
dtype: string
id: metadata
splits:
- name: train
num_bytes: 102128
num_examples: 38
download_size: 88081
dataset_size: 102128
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
Straive-Kripa/scrubbed_emails | Straive-Kripa | "2024-03-10T14:59:06Z" | 0 | 0 | [
"license:afl-3.0",
"size_categories:n<1K",
"format:text",
"modality:text",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T14:58:33Z" | ---
license: afl-3.0
---
|
Arnaldo34/Myvoice4 | Arnaldo34 | "2024-03-10T15:03:50Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T15:02:56Z" | ---
license: openrail
---
|
Arnaldo34/Minhavoz4 | Arnaldo34 | "2024-03-10T15:10:54Z" | 0 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-03-10T15:10:29Z" | ---
license: openrail
---
|
Kodezinh/Myvoice4 | Kodezinh | "2024-03-10T15:21:07Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T15:15:48Z" | ---
license: openrail
---
|
NorGLM/NO-Alpaca-Plus | NorGLM | "2024-10-01T18:35:51Z" | 0 | 0 | [
"language:no",
"license:cc-by-nc-sa-4.0",
"arxiv:2312.01314",
"region:us"
] | null | "2024-03-10T15:26:26Z" | ---
license: cc-by-nc-sa-4.0
language:
- 'no'
---
# Dataset Card
## Dataset Summary
NO-Alpaca-Plus consists of two parts: NO-Alpaca, taken from [NB Alpaca Norwegian Bokmål](https://huggingface.co/datasets/NbAiLab/norwegian-alpaca), a machine-translated Norwegian Bokmål instruction dataset derived from the English [Alpaca](https://huggingface.co/datasets/tatsu-lab/alpaca) instruction dataset; and several human-annotated Norwegian instruction samples pertaining to Norwegian culture and special expressions.
## Language
The data in NO-Alpaca-Plus are in Norwegian Bokmål.
## Data Instance
The data instances in *instruction_culture.numbers* and *instruction_special_expression.numbers* include **instruction**, **input** and **output** fields. An example is as follows:
```
{
instruction: Baser på følgende tekst, hvordan synes personen det er å jobbe i Norge?
input: Jeg trives stort med å jobbe i Norge. Mange vil kanskje mene at været er dårlig, nordmenn er kalde og livskvaliteten deretter er dårlig, selv om man tjener bra. Jeg tror imidlertid dette er å male fanden på veggen. Nordmenn er ulike og været varier, noe som for meg gjør livet mer innholdsrikt! I tillegg er det stort fokus på fritid, slik at man ikke møter veggen og blir utbrent av å jobbe for mye.
output: Personen trives godt med å jobbe i Norge, spesielt trekker personen fram balansen mellom jobb og fritid. Likevel viser personen forståelse for at alt ikke er perfekt.
}
```
## Data Split
NO-Alpaca is split 80:20 for fine-tuning and evaluating NorGLMs.
Data in *instruction_culture.csv* and *instruction_special_expression.csv* are used for a case study to test the ability of language models to understand Norwegian culture. *instruction_fine_tune.csv* includes all human-annotated instruction samples.
| | #samples |
|-------|---------------------|
| instruction_culture | 37 |
| instruction_special_expression | 65 |
| instruction_fine_tune | 102 |
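The 80:20 fine-tuning/evaluation split of NO-Alpaca mentioned above can be sketched as follows. Whether the authors shuffled before cutting, and with what seed, is not specified, so both are assumptions here.

```python
import random


def split_80_20(samples, seed=42):
    """Shuffle a list of samples and cut it into 80% train / 20% eval.

    The shuffle and the seed are assumptions; the card does not
    describe how the original split was produced.
    """
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * 0.8)
    return shuffled[:cut], shuffled[cut:]
```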
## Licensing Information
NO-Alpaca follows its original license, given in [link](https://huggingface.co/datasets/NbAiLab/norwegian-alpaca).
Our human-annotated data follow the cc-by-nc-sa-4.0 license.
## Citation Information
If you feel our work is helpful, please cite our paper:
```
@article{liu2023nlebench+,
title={NLEBench+ NorGLM: A Comprehensive Empirical Analysis and Benchmark Dataset for Generative Language Models in Norwegian},
author={Liu, Peng and Zhang, Lemei and Farup, Terje Nissen and Lauvrak, Even W and Ingvaldsen, Jon Espen and Eide, Simen and Gulla, Jon Atle and Yang, Zhirong},
journal={arXiv preprint arXiv:2312.01314},
year={2023}
}
``` |
mextre/frieren | mextre | "2024-03-12T14:45:22Z" | 0 | 0 | [
"license:unknown",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T15:39:53Z" | ---
license: unknown
---
|
Zutomama/acane | Zutomama | "2024-03-10T15:58:35Z" | 0 | 0 | [
"license:unknown",
"region:us"
] | null | "2024-03-10T15:43:47Z" | ---
license: unknown
---
|
shidowake/nu-dialogue_jmultiwoz_with_custom_sys_prompt_fixed2 | shidowake | "2024-03-11T03:06:26Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T15:46:35Z" | ---
dataset_info:
features:
- name: dialogue_id
dtype: int64
- name: goal_description
struct:
- name: attraction
sequence: string
- name: general
sequence: string
- name: hotel
sequence: string
- name: restaurant
sequence: string
- name: shopping
sequence: string
- name: taxi
sequence: string
- name: weather
sequence: string
- name: conversations
list:
- name: content
dtype: string
- name: role
dtype: string
- name: goal_description_array
sequence: string
- name: goal_description_concat
dtype: string
- name: system_input
dtype: string
- name: conversations_without_system_prompt
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train
num_bytes: 34534924
num_examples: 4246
download_size: 6892865
dataset_size: 34534924
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
# Description
A slightly modified and reformatted version of the original dataset for my own purposes.
# Original Dataset
- nu-dialogue/jmultiwoz
The JMultiWOZ dataset is licensed under a Creative Commons Attribution-NoDerivatives 4.0 International License.
- [nu-dialogue/jmultiwoz · Datasets at Hugging Face](https://huggingface.co/datasets/nu-dialogue/jmultiwoz)
- [nu-dialogue/jmultiwoz: JMultiWOZ: A Large-Scale Japanese Multi-Domain Task-Oriented Dialogue Dataset](https://github.com/nu-dialogue/jmultiwoz)
# License
CC BY-ND 4.0 DEED
- [CC BY-ND 4.0 Deed | Attribution-NoDerivs 4.0 International | Creative Commons](https://creativecommons.org/licenses/by-nd/4.0/)
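As a usage illustration, each record's `conversations` list (the `role`/`content` fields named in the YAML header above) can be flattened into a single prompt string. This is a sketch of one possible formatting, not the formatting used to build the dataset.

```python
def conversation_to_text(turns):
    """Join a list of {'role': ..., 'content': ...} turns into one
    newline-separated string, one 'role: content' line per turn."""
    return "\n".join(f"{t['role']}: {t['content']}" for t in turns)
```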
|
presencesw/phomt_eval_0_20 | presencesw | "2024-03-10T18:00:00Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T15:47:30Z" | ---
dataset_info:
features:
- name: en
dtype: string
- name: vi
dtype: string
splits:
- name: validation
num_bytes: 1166237.2348950265
num_examples: 6460
- name: test
num_bytes: 1146201.5113571093
num_examples: 5978
download_size: 567582
dataset_size: 2312438.746252136
configs:
- config_name: default
data_files:
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
LenguajeNaturalAI/wnli_testing | LenguajeNaturalAI | "2024-03-12T20:37:07Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T15:49:19Z" | ---
dataset_info:
features:
- name: sentence1
dtype: string
- name: sentence2
dtype: string
- name: label
dtype:
class_label:
names:
'0': not_entailment
'1': entailment
splits:
- name: train
num_bytes: 22279.0
num_examples: 127
- name: validation
num_bytes: 22279.0
num_examples: 127
- name: test
num_bytes: 22279.0
num_examples: 127
download_size: 46581
dataset_size: 66837.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
- split: test
path: data/test-*
---
|
open-llm-leaderboard-old/details_OEvortex__HelpingAI-110M | open-llm-leaderboard-old | "2024-03-10T15:51:45Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T15:51:25Z" | ---
pretty_name: Evaluation run of OEvortex/HelpingAI-110M
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [OEvortex/HelpingAI-110M](https://huggingface.co/OEvortex/HelpingAI-110M) on the\
\ [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_OEvortex__HelpingAI-110M\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T15:49:59.362653](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__HelpingAI-110M/blob/main/results_2024-03-10T15-49-59.362653.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.23731668171739753,\n\
\ \"acc_stderr\": 0.03006408174110251,\n \"acc_norm\": 0.23714256037066517,\n\
\ \"acc_norm_stderr\": 0.03085442849124215,\n \"mc1\": 0.24724602203182375,\n\
\ \"mc1_stderr\": 0.015102404797359654,\n \"mc2\": 0.4825112031965409,\n\
\ \"mc2_stderr\": 0.01620968936835715\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.2022184300341297,\n \"acc_stderr\": 0.011737454431872105,\n\
\ \"acc_norm\": 0.22781569965870307,\n \"acc_norm_stderr\": 0.012256708602326919\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.27454690300736906,\n\
\ \"acc_stderr\": 0.004453735900947837,\n \"acc_norm\": 0.2802230631348337,\n\
\ \"acc_norm_stderr\": 0.0044819026375056675\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.22,\n \"acc_stderr\": 0.04163331998932268,\n \
\ \"acc_norm\": 0.22,\n \"acc_norm_stderr\": 0.04163331998932268\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.17037037037037037,\n\
\ \"acc_stderr\": 0.032477811859955935,\n \"acc_norm\": 0.17037037037037037,\n\
\ \"acc_norm_stderr\": 0.032477811859955935\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.17763157894736842,\n \"acc_stderr\": 0.031103182383123398,\n\
\ \"acc_norm\": 0.17763157894736842,\n \"acc_norm_stderr\": 0.031103182383123398\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768078,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768078\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.19622641509433963,\n \"acc_stderr\": 0.024442388131100844,\n\
\ \"acc_norm\": 0.19622641509433963,\n \"acc_norm_stderr\": 0.024442388131100844\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.20833333333333334,\n\
\ \"acc_stderr\": 0.033961162058453336,\n \"acc_norm\": 0.20833333333333334,\n\
\ \"acc_norm_stderr\": 0.033961162058453336\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.042295258468165085,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.042295258468165085\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\"\
: 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.23,\n \"acc_stderr\": 0.04229525846816505,\n \
\ \"acc_norm\": 0.23,\n \"acc_norm_stderr\": 0.04229525846816505\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.20809248554913296,\n\
\ \"acc_stderr\": 0.030952890217749874,\n \"acc_norm\": 0.20809248554913296,\n\
\ \"acc_norm_stderr\": 0.030952890217749874\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.18627450980392157,\n \"acc_stderr\": 0.03873958714149353,\n\
\ \"acc_norm\": 0.18627450980392157,\n \"acc_norm_stderr\": 0.03873958714149353\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.18,\n \"acc_stderr\": 0.03861229196653697,\n \"acc_norm\": 0.18,\n\
\ \"acc_norm_stderr\": 0.03861229196653697\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.2680851063829787,\n \"acc_stderr\": 0.02895734278834235,\n\
\ \"acc_norm\": 0.2680851063829787,\n \"acc_norm_stderr\": 0.02895734278834235\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.24561403508771928,\n\
\ \"acc_stderr\": 0.0404933929774814,\n \"acc_norm\": 0.24561403508771928,\n\
\ \"acc_norm_stderr\": 0.0404933929774814\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.2827586206896552,\n \"acc_stderr\": 0.03752833958003337,\n\
\ \"acc_norm\": 0.2827586206896552,\n \"acc_norm_stderr\": 0.03752833958003337\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918407,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918407\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.037184890068181146,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.037184890068181146\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.18,\n \"acc_stderr\": 0.038612291966536934,\n \
\ \"acc_norm\": 0.18,\n \"acc_norm_stderr\": 0.038612291966536934\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.3,\n \"acc_stderr\": 0.026069362295335137,\n \"acc_norm\": 0.3,\n\
\ \"acc_norm_stderr\": 0.026069362295335137\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.22167487684729065,\n \"acc_stderr\": 0.0292255758924896,\n\
\ \"acc_norm\": 0.22167487684729065,\n \"acc_norm_stderr\": 0.0292255758924896\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.2,\n \"acc_stderr\": 0.04020151261036846,\n \"acc_norm\"\
: 0.2,\n \"acc_norm_stderr\": 0.04020151261036846\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.21818181818181817,\n \"acc_stderr\": 0.03225078108306289,\n\
\ \"acc_norm\": 0.21818181818181817,\n \"acc_norm_stderr\": 0.03225078108306289\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21212121212121213,\n \"acc_stderr\": 0.029126522834586825,\n \"\
acc_norm\": 0.21212121212121213,\n \"acc_norm_stderr\": 0.029126522834586825\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.19689119170984457,\n \"acc_stderr\": 0.028697873971860674,\n\
\ \"acc_norm\": 0.19689119170984457,\n \"acc_norm_stderr\": 0.028697873971860674\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.2128205128205128,\n \"acc_stderr\": 0.020752423722128002,\n\
\ \"acc_norm\": 0.2128205128205128,\n \"acc_norm_stderr\": 0.020752423722128002\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.25925925925925924,\n \"acc_stderr\": 0.026719240783712163,\n \
\ \"acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.026719240783712163\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.20168067226890757,\n \"acc_stderr\": 0.026064313406304523,\n\
\ \"acc_norm\": 0.20168067226890757,\n \"acc_norm_stderr\": 0.026064313406304523\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.24503311258278146,\n \"acc_stderr\": 0.035118075718047245,\n \"\
acc_norm\": 0.24503311258278146,\n \"acc_norm_stderr\": 0.035118075718047245\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.1908256880733945,\n \"acc_stderr\": 0.016847676400091112,\n \"\
acc_norm\": 0.1908256880733945,\n \"acc_norm_stderr\": 0.016847676400091112\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.39814814814814814,\n \"acc_stderr\": 0.033384734032074016,\n \"\
acc_norm\": 0.39814814814814814,\n \"acc_norm_stderr\": 0.033384734032074016\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25,\n \"acc_stderr\": 0.03039153369274154,\n \"acc_norm\": 0.25,\n\
\ \"acc_norm_stderr\": 0.03039153369274154\n },\n \"harness|hendrycksTest-high_school_world_history|5\"\
: {\n \"acc\": 0.2742616033755274,\n \"acc_stderr\": 0.02904133351059804,\n\
\ \"acc_norm\": 0.2742616033755274,\n \"acc_norm_stderr\": 0.02904133351059804\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3183856502242152,\n\
\ \"acc_stderr\": 0.03126580522513713,\n \"acc_norm\": 0.3183856502242152,\n\
\ \"acc_norm_stderr\": 0.03126580522513713\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.2595419847328244,\n \"acc_stderr\": 0.03844876139785271,\n\
\ \"acc_norm\": 0.2595419847328244,\n \"acc_norm_stderr\": 0.03844876139785271\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.03941897526516302,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.03941897526516302\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.2222222222222222,\n\
\ \"acc_stderr\": 0.040191074725573483,\n \"acc_norm\": 0.2222222222222222,\n\
\ \"acc_norm_stderr\": 0.040191074725573483\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.24539877300613497,\n \"acc_stderr\": 0.03380939813943354,\n\
\ \"acc_norm\": 0.24539877300613497,\n \"acc_norm_stderr\": 0.03380939813943354\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.2767857142857143,\n\
\ \"acc_stderr\": 0.042466243366976256,\n \"acc_norm\": 0.2767857142857143,\n\
\ \"acc_norm_stderr\": 0.042466243366976256\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.17475728155339806,\n \"acc_stderr\": 0.037601780060266224,\n\
\ \"acc_norm\": 0.17475728155339806,\n \"acc_norm_stderr\": 0.037601780060266224\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.19230769230769232,\n\
\ \"acc_stderr\": 0.025819233256483717,\n \"acc_norm\": 0.19230769230769232,\n\
\ \"acc_norm_stderr\": 0.025819233256483717\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2388250319284802,\n\
\ \"acc_stderr\": 0.015246803197398687,\n \"acc_norm\": 0.2388250319284802,\n\
\ \"acc_norm_stderr\": 0.015246803197398687\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.24566473988439305,\n \"acc_stderr\": 0.02317629820399201,\n\
\ \"acc_norm\": 0.24566473988439305,\n \"acc_norm_stderr\": 0.02317629820399201\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.23798882681564246,\n\
\ \"acc_stderr\": 0.014242630070574915,\n \"acc_norm\": 0.23798882681564246,\n\
\ \"acc_norm_stderr\": 0.014242630070574915\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.20915032679738563,\n \"acc_stderr\": 0.023287685312334806,\n\
\ \"acc_norm\": 0.20915032679738563,\n \"acc_norm_stderr\": 0.023287685312334806\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.1832797427652733,\n\
\ \"acc_stderr\": 0.02197419884826581,\n \"acc_norm\": 0.1832797427652733,\n\
\ \"acc_norm_stderr\": 0.02197419884826581\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.21604938271604937,\n \"acc_stderr\": 0.022899162918445806,\n\
\ \"acc_norm\": 0.21604938271604937,\n \"acc_norm_stderr\": 0.022899162918445806\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.22340425531914893,\n \"acc_stderr\": 0.02484792135806396,\n \
\ \"acc_norm\": 0.22340425531914893,\n \"acc_norm_stderr\": 0.02484792135806396\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.242503259452412,\n\
\ \"acc_stderr\": 0.010946570966348787,\n \"acc_norm\": 0.242503259452412,\n\
\ \"acc_norm_stderr\": 0.010946570966348787\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.02456220431414232,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.02456220431414232\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.25,\n \"acc_stderr\": 0.01751781884501444,\n \"acc_norm\"\
: 0.25,\n \"acc_norm_stderr\": 0.01751781884501444\n },\n \"harness|hendrycksTest-public_relations|5\"\
: {\n \"acc\": 0.24545454545454545,\n \"acc_stderr\": 0.04122066502878285,\n\
\ \"acc_norm\": 0.24545454545454545,\n \"acc_norm_stderr\": 0.04122066502878285\n\
\ },\n \"harness|hendrycksTest-security_studies|5\": {\n \"acc\": 0.19183673469387755,\n\
\ \"acc_stderr\": 0.025206963154225395,\n \"acc_norm\": 0.19183673469387755,\n\
\ \"acc_norm_stderr\": 0.025206963154225395\n },\n \"harness|hendrycksTest-sociology|5\"\
: {\n \"acc\": 0.22885572139303484,\n \"acc_stderr\": 0.029705284056772436,\n\
\ \"acc_norm\": 0.22885572139303484,\n \"acc_norm_stderr\": 0.029705284056772436\n\
\ },\n \"harness|hendrycksTest-us_foreign_policy|5\": {\n \"acc\":\
\ 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \"acc_norm\": 0.32,\n\
\ \"acc_norm_stderr\": 0.046882617226215034\n },\n \"harness|hendrycksTest-virology|5\"\
: {\n \"acc\": 0.28313253012048195,\n \"acc_stderr\": 0.03507295431370518,\n\
\ \"acc_norm\": 0.28313253012048195,\n \"acc_norm_stderr\": 0.03507295431370518\n\
\ },\n \"harness|hendrycksTest-world_religions|5\": {\n \"acc\": 0.2982456140350877,\n\
\ \"acc_stderr\": 0.03508771929824563,\n \"acc_norm\": 0.2982456140350877,\n\
\ \"acc_norm_stderr\": 0.03508771929824563\n },\n \"harness|truthfulqa:mc|0\"\
: {\n \"mc1\": 0.24724602203182375,\n \"mc1_stderr\": 0.015102404797359654,\n\
\ \"mc2\": 0.4825112031965409,\n \"mc2_stderr\": 0.01620968936835715\n\
\ },\n \"harness|winogrande|5\": {\n \"acc\": 0.516179952644041,\n\
\ \"acc_stderr\": 0.014045126130978601\n },\n \"harness|gsm8k|5\":\
\ {\n \"acc\": 0.0,\n \"acc_stderr\": 0.0\n }\n}\n```"
repo_url: https://huggingface.co/OEvortex/HelpingAI-110M
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|arc:challenge|25_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|gsm8k|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hellaswag|10_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T15-49-59.362653.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T15-49-59.362653.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- '**/details_harness|winogrande|5_2024-03-10T15-49-59.362653.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T15-49-59.362653.parquet'
- config_name: results
data_files:
- split: 2024_03_10T15_49_59.362653
path:
- results_2024-03-10T15-49-59.362653.parquet
- split: latest
path:
- results_2024-03-10T15-49-59.362653.parquet
---
# Dataset Card for Evaluation run of OEvortex/HelpingAI-110M
Dataset automatically created during the evaluation run of model [OEvortex/HelpingAI-110M](https://huggingface.co/OEvortex/HelpingAI-110M) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_OEvortex__HelpingAI-110M",
"harness_winogrande_5",
split="train")
```
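Each per-task configuration name follows a predictable pattern: the prefix `harness_`, the task name with `:` and `-` replaced by `_`, and the few-shot count as a suffix (e.g. `harness_truthfulqa_mc_0`, `harness_hendrycksTest_anatomy_5`). A small helper (a sketch, not part of the evaluation harness itself) can build these names programmatically:

```python
def harness_config_name(task: str, n_shot: int) -> str:
    """Build a dataset config name from a harness task and shot count.

    Examples of the naming scheme used by this dataset:
      ("winogrande", 5)            -> "harness_winogrande_5"
      ("truthfulqa:mc", 0)         -> "harness_truthfulqa_mc_0"
      ("hendrycksTest-anatomy", 5) -> "harness_hendrycksTest_anatomy_5"
    """
    # ':' and '-' in harness task names become '_' in config names.
    sanitized = task.replace(":", "_").replace("-", "_")
    return f"harness_{sanitized}_{n_shot}"
```

This can be combined with `load_dataset` to iterate over several task configs without spelling each name out by hand.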
## Latest results
These are the [latest results from run 2024-03-10T15:49:59.362653](https://huggingface.co/datasets/open-llm-leaderboard/details_OEvortex__HelpingAI-110M/blob/main/results_2024-03-10T15-49-59.362653.json) (note that there might be results for other tasks in the repo if successive evals didn't cover the same tasks; you can find each one in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.23731668171739753,
"acc_stderr": 0.03006408174110251,
"acc_norm": 0.23714256037066517,
"acc_norm_stderr": 0.03085442849124215,
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359654,
"mc2": 0.4825112031965409,
"mc2_stderr": 0.01620968936835715
},
"harness|arc:challenge|25": {
"acc": 0.2022184300341297,
"acc_stderr": 0.011737454431872105,
"acc_norm": 0.22781569965870307,
"acc_norm_stderr": 0.012256708602326919
},
"harness|hellaswag|10": {
"acc": 0.27454690300736906,
"acc_stderr": 0.004453735900947837,
"acc_norm": 0.2802230631348337,
"acc_norm_stderr": 0.0044819026375056675
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.22,
"acc_stderr": 0.04163331998932268,
"acc_norm": 0.22,
"acc_norm_stderr": 0.04163331998932268
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.17037037037037037,
"acc_stderr": 0.032477811859955935,
"acc_norm": 0.17037037037037037,
"acc_norm_stderr": 0.032477811859955935
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.17763157894736842,
"acc_stderr": 0.031103182383123398,
"acc_norm": 0.17763157894736842,
"acc_norm_stderr": 0.031103182383123398
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768078,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768078
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.19622641509433963,
"acc_stderr": 0.024442388131100844,
"acc_norm": 0.19622641509433963,
"acc_norm_stderr": 0.024442388131100844
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.20833333333333334,
"acc_stderr": 0.033961162058453336,
"acc_norm": 0.20833333333333334,
"acc_norm_stderr": 0.033961162058453336
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.23,
"acc_stderr": 0.042295258468165085,
"acc_norm": 0.23,
"acc_norm_stderr": 0.042295258468165085
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.23,
"acc_stderr": 0.04229525846816505,
"acc_norm": 0.23,
"acc_norm_stderr": 0.04229525846816505
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.20809248554913296,
"acc_stderr": 0.030952890217749874,
"acc_norm": 0.20809248554913296,
"acc_norm_stderr": 0.030952890217749874
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.18627450980392157,
"acc_stderr": 0.03873958714149353,
"acc_norm": 0.18627450980392157,
"acc_norm_stderr": 0.03873958714149353
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.18,
"acc_stderr": 0.03861229196653697,
"acc_norm": 0.18,
"acc_norm_stderr": 0.03861229196653697
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.2680851063829787,
"acc_stderr": 0.02895734278834235,
"acc_norm": 0.2680851063829787,
"acc_norm_stderr": 0.02895734278834235
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.24561403508771928,
"acc_stderr": 0.0404933929774814,
"acc_norm": 0.24561403508771928,
"acc_norm_stderr": 0.0404933929774814
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.2827586206896552,
"acc_stderr": 0.03752833958003337,
"acc_norm": 0.2827586206896552,
"acc_norm_stderr": 0.03752833958003337
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918407,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918407
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.037184890068181146,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.037184890068181146
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.18,
"acc_stderr": 0.038612291966536934,
"acc_norm": 0.18,
"acc_norm_stderr": 0.038612291966536934
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.3,
"acc_stderr": 0.026069362295335137,
"acc_norm": 0.3,
"acc_norm_stderr": 0.026069362295335137
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.22167487684729065,
"acc_stderr": 0.0292255758924896,
"acc_norm": 0.22167487684729065,
"acc_norm_stderr": 0.0292255758924896
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.2,
"acc_stderr": 0.04020151261036846,
"acc_norm": 0.2,
"acc_norm_stderr": 0.04020151261036846
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.21818181818181817,
"acc_stderr": 0.03225078108306289,
"acc_norm": 0.21818181818181817,
"acc_norm_stderr": 0.03225078108306289
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21212121212121213,
"acc_stderr": 0.029126522834586825,
"acc_norm": 0.21212121212121213,
"acc_norm_stderr": 0.029126522834586825
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.19689119170984457,
"acc_stderr": 0.028697873971860674,
"acc_norm": 0.19689119170984457,
"acc_norm_stderr": 0.028697873971860674
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.2128205128205128,
"acc_stderr": 0.020752423722128002,
"acc_norm": 0.2128205128205128,
"acc_norm_stderr": 0.020752423722128002
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.026719240783712163,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.026719240783712163
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.20168067226890757,
"acc_stderr": 0.026064313406304523,
"acc_norm": 0.20168067226890757,
"acc_norm_stderr": 0.026064313406304523
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.24503311258278146,
"acc_stderr": 0.035118075718047245,
"acc_norm": 0.24503311258278146,
"acc_norm_stderr": 0.035118075718047245
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.1908256880733945,
"acc_stderr": 0.016847676400091112,
"acc_norm": 0.1908256880733945,
"acc_norm_stderr": 0.016847676400091112
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.39814814814814814,
"acc_stderr": 0.033384734032074016,
"acc_norm": 0.39814814814814814,
"acc_norm_stderr": 0.033384734032074016
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25,
"acc_stderr": 0.03039153369274154,
"acc_norm": 0.25,
"acc_norm_stderr": 0.03039153369274154
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.2742616033755274,
"acc_stderr": 0.02904133351059804,
"acc_norm": 0.2742616033755274,
"acc_norm_stderr": 0.02904133351059804
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3183856502242152,
"acc_stderr": 0.03126580522513713,
"acc_norm": 0.3183856502242152,
"acc_norm_stderr": 0.03126580522513713
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.2595419847328244,
"acc_stderr": 0.03844876139785271,
"acc_norm": 0.2595419847328244,
"acc_norm_stderr": 0.03844876139785271
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.03941897526516302,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.03941897526516302
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.2222222222222222,
"acc_stderr": 0.040191074725573483,
"acc_norm": 0.2222222222222222,
"acc_norm_stderr": 0.040191074725573483
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.24539877300613497,
"acc_stderr": 0.03380939813943354,
"acc_norm": 0.24539877300613497,
"acc_norm_stderr": 0.03380939813943354
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.2767857142857143,
"acc_stderr": 0.042466243366976256,
"acc_norm": 0.2767857142857143,
"acc_norm_stderr": 0.042466243366976256
},
"harness|hendrycksTest-management|5": {
"acc": 0.17475728155339806,
"acc_stderr": 0.037601780060266224,
"acc_norm": 0.17475728155339806,
"acc_norm_stderr": 0.037601780060266224
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.19230769230769232,
"acc_stderr": 0.025819233256483717,
"acc_norm": 0.19230769230769232,
"acc_norm_stderr": 0.025819233256483717
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2388250319284802,
"acc_stderr": 0.015246803197398687,
"acc_norm": 0.2388250319284802,
"acc_norm_stderr": 0.015246803197398687
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.24566473988439305,
"acc_stderr": 0.02317629820399201,
"acc_norm": 0.24566473988439305,
"acc_norm_stderr": 0.02317629820399201
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.23798882681564246,
"acc_stderr": 0.014242630070574915,
"acc_norm": 0.23798882681564246,
"acc_norm_stderr": 0.014242630070574915
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.20915032679738563,
"acc_stderr": 0.023287685312334806,
"acc_norm": 0.20915032679738563,
"acc_norm_stderr": 0.023287685312334806
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.1832797427652733,
"acc_stderr": 0.02197419884826581,
"acc_norm": 0.1832797427652733,
"acc_norm_stderr": 0.02197419884826581
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.21604938271604937,
"acc_stderr": 0.022899162918445806,
"acc_norm": 0.21604938271604937,
"acc_norm_stderr": 0.022899162918445806
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.22340425531914893,
"acc_stderr": 0.02484792135806396,
"acc_norm": 0.22340425531914893,
"acc_norm_stderr": 0.02484792135806396
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.242503259452412,
"acc_stderr": 0.010946570966348787,
"acc_norm": 0.242503259452412,
"acc_norm_stderr": 0.010946570966348787
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.02456220431414232,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.02456220431414232
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.25,
"acc_stderr": 0.01751781884501444,
"acc_norm": 0.25,
"acc_norm_stderr": 0.01751781884501444
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.24545454545454545,
"acc_stderr": 0.04122066502878285,
"acc_norm": 0.24545454545454545,
"acc_norm_stderr": 0.04122066502878285
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.19183673469387755,
"acc_stderr": 0.025206963154225395,
"acc_norm": 0.19183673469387755,
"acc_norm_stderr": 0.025206963154225395
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.22885572139303484,
"acc_stderr": 0.029705284056772436,
"acc_norm": 0.22885572139303484,
"acc_norm_stderr": 0.029705284056772436
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-virology|5": {
"acc": 0.28313253012048195,
"acc_stderr": 0.03507295431370518,
"acc_norm": 0.28313253012048195,
"acc_norm_stderr": 0.03507295431370518
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.2982456140350877,
"acc_stderr": 0.03508771929824563,
"acc_norm": 0.2982456140350877,
"acc_norm_stderr": 0.03508771929824563
},
"harness|truthfulqa:mc|0": {
"mc1": 0.24724602203182375,
"mc1_stderr": 0.015102404797359654,
"mc2": 0.4825112031965409,
"mc2_stderr": 0.01620968936835715
},
"harness|winogrande|5": {
"acc": 0.516179952644041,
"acc_stderr": 0.014045126130978601
},
"harness|gsm8k|5": {
"acc": 0.0,
"acc_stderr": 0.0
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
yuhsintw/TW_ED_exam | yuhsintw | "2024-03-10T23:59:19Z" | 0 | 3 | [
"task_categories:question-answering",
"language:zh",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us",
"Question Answering",
"Emergency Medicine",
"Taiwan",
"Board Exam",
"Traditional Chinese",
"Chinese"
] | [
"question-answering"
] | "2024-03-10T15:58:07Z" | ---
task_categories:
- question-answering
language:
- zh
tags:
- Question Answering
- Emergency Medicine
- Taiwan
- Board Exam
- Traditional Chinese
- Chinese
pretty_name: Taiwan Emergency Medicine Board Exam
size_categories:
- 10K<n<100K
--- |
eduvance/dpl | eduvance | "2024-03-10T16:10:48Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T16:02:42Z" | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
- name: text
dtype: string
splits:
- name: train
num_bytes: 40177.475409836065
num_examples: 42
- name: test
num_bytes: 18175.524590163935
num_examples: 19
download_size: 28393
dataset_size: 58353.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
Lonewolf2441139/gcdata_llama3_3000 | Lonewolf2441139 | "2024-03-10T16:13:44Z" | 0 | 0 | [
"size_categories:1K<n<10K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T16:13:34Z" | ---
dataset_info:
features:
- name: text
dtype: string
splits:
- name: train
num_bytes: 4836070
num_examples: 3077
download_size: 1723317
dataset_size: 4836070
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
nestymeee/filtered-imaterialist | nestymeee | "2024-03-10T17:00:11Z" | 0 | 1 | [
"license:mit",
"region:us"
] | null | "2024-03-10T16:15:17Z" | ---
license: mit
---
## Dataset for clothes segmentation
![plot](https://firebasestorage.googleapis.com/v0/b/aestyapp.appspot.com/o/user_looks%2FScreenshot%202024-03-10%20at%205.59.03%E2%80%AFPM.png?alt=media&token=0c3930b0-503c-443d-87cc-95d072126d34)
Based on [iMaterialist Dataset](https://www.kaggle.com/c/imaterialist-fashion-2019-FGVC6/data) with several adjustments:
1. Filtered out images with more than one person or with no labeled clothes
2. Reduced the number of classes to 8: `'background', 'upperbody', 'upperbody_up', 'lowerbody', 'wholebody', 'wholebody_up', 'shoes', 'accesories'`
3. Simple structure with 2 folders: images 512x512 in `.jpg` and corresponding segmaps 512x512 in `.npy`
4. You can find an example dataset class and data visualization in `dataset.ipynb`
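
With the two-folder layout above, one image/segmap pair can be read with `PIL` and `numpy.load`. The sketch below tallies per-class pixel counts; the folder and file names in the comments are assumptions about the archive layout (a synthetic segmap stands in so the snippet is self-contained):

```python
import numpy as np

CLASSES = ['background', 'upperbody', 'upperbody_up', 'lowerbody',
           'wholebody', 'wholebody_up', 'shoes', 'accesories']

# In practice you would load a real pair, e.g. (paths are hypothetical):
#   image  = PIL.Image.open("images/0001.jpg")   # 512x512 JPEG
#   segmap = np.load("segmaps/0001.npy")         # 512x512 array of class ids 0-7
# A synthetic segmap is used here instead.
segmap = np.zeros((512, 512), dtype=np.uint8)
segmap[100:200, 100:300] = 1  # fake region labeled 'upperbody'

# Pixel count per class name
counts = {name: int((segmap == i).sum()) for i, name in enumerate(CLASSES)}
print(counts['upperbody'])  # 20000
```

The same `counts` dict is handy for checking class balance across the whole dataset before training.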
If you find any bugs, please contact me on: nestymeee@gmail.com |
valdineiarcenio/galvaobueno2 | valdineiarcenio | "2024-03-10T16:18:56Z" | 0 | 0 | [
"license:openrail",
"size_categories:n<1K",
"format:audiofolder",
"modality:audio",
"library:datasets",
"library:mlcroissant",
"region:us"
] | null | "2024-03-10T16:16:38Z" | ---
license: openrail
---
|
ryan2009/ph | ryan2009 | "2024-03-10T16:39:40Z" | 0 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-03-10T16:39:40Z" | ---
license: openrail
---
|
open-llm-leaderboard-old/details_Abhaykoul__HelpingAI-Lite-4x1b | open-llm-leaderboard-old | "2024-03-10T16:41:04Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T16:40:43Z" | ---
pretty_name: Evaluation run of Abhaykoul/HelpingAI-Lite-4x1b
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Abhaykoul/HelpingAI-Lite-4x1b](https://huggingface.co/Abhaykoul/HelpingAI-Lite-4x1b)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configuration, each one coresponding to one of the\
\ evaluated task.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Abhaykoul__HelpingAI-Lite-4x1b\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T16:38:51.918017](https://huggingface.co/datasets/open-llm-leaderboard/details_Abhaykoul__HelpingAI-Lite-4x1b/blob/main/results_2024-03-10T16-38-51.918017.json)(note\
\ that their might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.25937273077878203,\n\
\ \"acc_stderr\": 0.030829064656615418,\n \"acc_norm\": 0.260299101923976,\n\
\ \"acc_norm_stderr\": 0.03157765952162184,\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3739297200017789,\n\
\ \"mc2_stderr\": 0.013865567135235702\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.34812286689419797,\n \"acc_stderr\": 0.013921008595179333,\n\
\ \"acc_norm\": 0.3583617747440273,\n \"acc_norm_stderr\": 0.01401288333485986\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.4576777534355706,\n\
\ \"acc_stderr\": 0.0049718741597776965,\n \"acc_norm\": 0.6100378410675165,\n\
\ \"acc_norm_stderr\": 0.004867445945277154\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \
\ \"acc_norm\": 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.15555555555555556,\n\
\ \"acc_stderr\": 0.03130948364878314,\n \"acc_norm\": 0.15555555555555556,\n\
\ \"acc_norm_stderr\": 0.03130948364878314\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.18421052631578946,\n \"acc_stderr\": 0.0315469804508223,\n\
\ \"acc_norm\": 0.18421052631578946,\n \"acc_norm_stderr\": 0.0315469804508223\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.04408440022768079,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.04408440022768079\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.2679245283018868,\n \"acc_stderr\": 0.027257260322494845,\n\
\ \"acc_norm\": 0.2679245283018868,\n \"acc_norm_stderr\": 0.027257260322494845\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.24305555555555555,\n\
\ \"acc_stderr\": 0.0358687928008034,\n \"acc_norm\": 0.24305555555555555,\n\
\ \"acc_norm_stderr\": 0.0358687928008034\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.27,\n \"acc_stderr\": 0.0446196043338474,\n \
\ \"acc_norm\": 0.27,\n \"acc_norm_stderr\": 0.0446196043338474\n },\n\
\ \"harness|hendrycksTest-college_computer_science|5\": {\n \"acc\": 0.26,\n\
\ \"acc_stderr\": 0.044084400227680794,\n \"acc_norm\": 0.26,\n \
\ \"acc_norm_stderr\": 0.044084400227680794\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.26,\n \"acc_stderr\": 0.0440844002276808,\n \
\ \"acc_norm\": 0.26,\n \"acc_norm_stderr\": 0.0440844002276808\n },\n\
\ \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.2023121387283237,\n\
\ \"acc_stderr\": 0.030631145539198826,\n \"acc_norm\": 0.2023121387283237,\n\
\ \"acc_norm_stderr\": 0.030631145539198826\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.20588235294117646,\n \"acc_stderr\": 0.04023382273617749,\n\
\ \"acc_norm\": 0.20588235294117646,\n \"acc_norm_stderr\": 0.04023382273617749\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.28,\n \"acc_stderr\": 0.04512608598542127,\n \"acc_norm\": 0.28,\n\
\ \"acc_norm_stderr\": 0.04512608598542127\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.24680851063829787,\n \"acc_stderr\": 0.0281854413012341,\n\
\ \"acc_norm\": 0.24680851063829787,\n \"acc_norm_stderr\": 0.0281854413012341\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.21052631578947367,\n\
\ \"acc_stderr\": 0.038351539543994194,\n \"acc_norm\": 0.21052631578947367,\n\
\ \"acc_norm_stderr\": 0.038351539543994194\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.22758620689655173,\n \"acc_stderr\": 0.03493950380131183,\n\
\ \"acc_norm\": 0.22758620689655173,\n \"acc_norm_stderr\": 0.03493950380131183\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.25925925925925924,\n \"acc_stderr\": 0.022569897074918417,\n \"\
acc_norm\": 0.25925925925925924,\n \"acc_norm_stderr\": 0.022569897074918417\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.24603174603174602,\n\
\ \"acc_stderr\": 0.038522733649243156,\n \"acc_norm\": 0.24603174603174602,\n\
\ \"acc_norm_stderr\": 0.038522733649243156\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.04560480215720683,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.04560480215720683\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.2161290322580645,\n\
\ \"acc_stderr\": 0.02341529343356852,\n \"acc_norm\": 0.2161290322580645,\n\
\ \"acc_norm_stderr\": 0.02341529343356852\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.21674876847290642,\n \"acc_stderr\": 0.028990331252516235,\n\
\ \"acc_norm\": 0.21674876847290642,\n \"acc_norm_stderr\": 0.028990331252516235\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.21,\n \"acc_stderr\": 0.040936018074033256,\n \"acc_norm\"\
: 0.21,\n \"acc_norm_stderr\": 0.040936018074033256\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.24242424242424243,\n \"acc_stderr\": 0.033464098810559534,\n\
\ \"acc_norm\": 0.24242424242424243,\n \"acc_norm_stderr\": 0.033464098810559534\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.21717171717171718,\n \"acc_stderr\": 0.029376616484945637,\n \"\
acc_norm\": 0.21717171717171718,\n \"acc_norm_stderr\": 0.029376616484945637\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.23316062176165803,\n \"acc_stderr\": 0.030516111371476008,\n\
\ \"acc_norm\": 0.23316062176165803,\n \"acc_norm_stderr\": 0.030516111371476008\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.26153846153846155,\n \"acc_stderr\": 0.022282141204204426,\n\
\ \"acc_norm\": 0.26153846153846155,\n \"acc_norm_stderr\": 0.022282141204204426\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.24814814814814815,\n \"acc_stderr\": 0.026335739404055803,\n \
\ \"acc_norm\": 0.24814814814814815,\n \"acc_norm_stderr\": 0.026335739404055803\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.24369747899159663,\n \"acc_stderr\": 0.027886828078380544,\n\
\ \"acc_norm\": 0.24369747899159663,\n \"acc_norm_stderr\": 0.027886828078380544\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.2052980132450331,\n \"acc_stderr\": 0.03297986648473836,\n \"\
acc_norm\": 0.2052980132450331,\n \"acc_norm_stderr\": 0.03297986648473836\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.24036697247706423,\n \"acc_stderr\": 0.01832060732096407,\n \"\
acc_norm\": 0.24036697247706423,\n \"acc_norm_stderr\": 0.01832060732096407\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.42592592592592593,\n \"acc_stderr\": 0.033723432716530624,\n \"\
acc_norm\": 0.42592592592592593,\n \"acc_norm_stderr\": 0.033723432716530624\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.25980392156862747,\n \"acc_stderr\": 0.03077855467869327,\n \"\
acc_norm\": 0.25980392156862747,\n \"acc_norm_stderr\": 0.03077855467869327\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.25316455696202533,\n \"acc_stderr\": 0.028304657943035307,\n \
\ \"acc_norm\": 0.25316455696202533,\n \"acc_norm_stderr\": 0.028304657943035307\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.3542600896860987,\n\
\ \"acc_stderr\": 0.032100621541349864,\n \"acc_norm\": 0.3542600896860987,\n\
\ \"acc_norm_stderr\": 0.032100621541349864\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.26717557251908397,\n \"acc_stderr\": 0.03880848301082396,\n\
\ \"acc_norm\": 0.26717557251908397,\n \"acc_norm_stderr\": 0.03880848301082396\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.24793388429752067,\n \"acc_stderr\": 0.039418975265163025,\n \"\
acc_norm\": 0.24793388429752067,\n \"acc_norm_stderr\": 0.039418975265163025\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.23148148148148148,\n\
\ \"acc_stderr\": 0.04077494709252626,\n \"acc_norm\": 0.23148148148148148,\n\
\ \"acc_norm_stderr\": 0.04077494709252626\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.22699386503067484,\n \"acc_stderr\": 0.032910995786157686,\n\
\ \"acc_norm\": 0.22699386503067484,\n \"acc_norm_stderr\": 0.032910995786157686\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.30357142857142855,\n\
\ \"acc_stderr\": 0.04364226155841044,\n \"acc_norm\": 0.30357142857142855,\n\
\ \"acc_norm_stderr\": 0.04364226155841044\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.24271844660194175,\n \"acc_stderr\": 0.04245022486384493,\n\
\ \"acc_norm\": 0.24271844660194175,\n \"acc_norm_stderr\": 0.04245022486384493\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.27350427350427353,\n\
\ \"acc_stderr\": 0.029202540153431166,\n \"acc_norm\": 0.27350427350427353,\n\
\ \"acc_norm_stderr\": 0.029202540153431166\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.2822477650063857,\n\
\ \"acc_stderr\": 0.016095302969878565,\n \"acc_norm\": 0.2822477650063857,\n\
\ \"acc_norm_stderr\": 0.016095302969878565\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.21676300578034682,\n \"acc_stderr\": 0.02218347766841285,\n\
\ \"acc_norm\": 0.21676300578034682,\n \"acc_norm_stderr\": 0.02218347766841285\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.24804469273743016,\n\
\ \"acc_stderr\": 0.014444157808261427,\n \"acc_norm\": 0.24804469273743016,\n\
\ \"acc_norm_stderr\": 0.014444157808261427\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.24836601307189543,\n \"acc_stderr\": 0.024739981355113592,\n\
\ \"acc_norm\": 0.24836601307189543,\n \"acc_norm_stderr\": 0.024739981355113592\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.2733118971061093,\n\
\ \"acc_stderr\": 0.02531176597542612,\n \"acc_norm\": 0.2733118971061093,\n\
\ \"acc_norm_stderr\": 0.02531176597542612\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.2623456790123457,\n \"acc_stderr\": 0.02447722285613511,\n\
\ \"acc_norm\": 0.2623456790123457,\n \"acc_norm_stderr\": 0.02447722285613511\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.2765957446808511,\n \"acc_stderr\": 0.02668456434046099,\n \
\ \"acc_norm\": 0.2765957446808511,\n \"acc_norm_stderr\": 0.02668456434046099\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.23598435462842243,\n\
\ \"acc_stderr\": 0.010844802669662689,\n \"acc_norm\": 0.23598435462842243,\n\
\ \"acc_norm_stderr\": 0.010844802669662689\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.2757352941176471,\n \"acc_stderr\": 0.027146271936625162,\n\
\ \"acc_norm\": 0.2757352941176471,\n \"acc_norm_stderr\": 0.027146271936625162\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.2581699346405229,\n \"acc_stderr\": 0.017704531653250075,\n \
\ \"acc_norm\": 0.2581699346405229,\n \"acc_norm_stderr\": 0.017704531653250075\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.2818181818181818,\n\
\ \"acc_stderr\": 0.043091187099464585,\n \"acc_norm\": 0.2818181818181818,\n\
\ \"acc_norm_stderr\": 0.043091187099464585\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.13877551020408163,\n \"acc_stderr\": 0.022131950419972655,\n\
\ \"acc_norm\": 0.13877551020408163,\n \"acc_norm_stderr\": 0.022131950419972655\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.24378109452736318,\n\
\ \"acc_stderr\": 0.030360490154014645,\n \"acc_norm\": 0.24378109452736318,\n\
\ \"acc_norm_stderr\": 0.030360490154014645\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.3,\n \"acc_stderr\": 0.046056618647183814,\n \
\ \"acc_norm\": 0.3,\n \"acc_norm_stderr\": 0.046056618647183814\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.3253012048192771,\n\
\ \"acc_stderr\": 0.036471685236832266,\n \"acc_norm\": 0.3253012048192771,\n\
\ \"acc_norm_stderr\": 0.036471685236832266\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.3216374269005848,\n \"acc_stderr\": 0.03582529442573122,\n\
\ \"acc_norm\": 0.3216374269005848,\n \"acc_norm_stderr\": 0.03582529442573122\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.22888616891064872,\n\
\ \"mc1_stderr\": 0.014706994909055027,\n \"mc2\": 0.3739297200017789,\n\
\ \"mc2_stderr\": 0.013865567135235702\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.6077348066298343,\n \"acc_stderr\": 0.013722400462000885\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.018953752843062926,\n \
\ \"acc_stderr\": 0.0037560783410314712\n }\n}\n```"
repo_url: https://huggingface.co/Abhaykoul/HelpingAI-Lite-4x1b
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|arc:challenge|25_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|gsm8k|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hellaswag|10_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-38-51.918017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T16-38-51.918017.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- '**/details_harness|winogrande|5_2024-03-10T16-38-51.918017.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T16-38-51.918017.parquet'
- config_name: results
data_files:
- split: 2024_03_10T16_38_51.918017
path:
- results_2024-03-10T16-38-51.918017.parquet
- split: latest
path:
- results_2024-03-10T16-38-51.918017.parquet
---
# Dataset Card for Evaluation run of Abhaykoul/HelpingAI-Lite-4x1b
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Abhaykoul/HelpingAI-Lite-4x1b](https://huggingface.co/Abhaykoul/HelpingAI-Lite-4x1b) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Abhaykoul__HelpingAI-Lite-4x1b",
"harness_winogrande_5",
split="train")
```
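As the config listing above suggests, each per-run split name appears to be derived from the run timestamp used in the parquet file names by replacing the `-` separators with `_`. A minimal illustrative helper (hypothetical, inferred from the naming convention in this card):

```python
def split_name_from_timestamp(ts: str) -> str:
    """Map a file timestamp like '2024-03-10T16-38-51.918017'
    to the corresponding split name '2024_03_10T16_38_51.918017'."""
    # Split names use '_' everywhere the file timestamp uses '-'.
    return ts.replace("-", "_")

print(split_name_from_timestamp("2024-03-10T16-38-51.918017"))
# 2024_03_10T16_38_51.918017
```

This split name can then be passed to `load_dataset(..., split=...)` in place of `"train"` or `"latest"` to pin a specific run.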
## Latest results
These are the [latest results from run 2024-03-10T16:38:51.918017](https://huggingface.co/datasets/open-llm-leaderboard/details_Abhaykoul__HelpingAI-Lite-4x1b/blob/main/results_2024-03-10T16-38-51.918017.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.25937273077878203,
"acc_stderr": 0.030829064656615418,
"acc_norm": 0.260299101923976,
"acc_norm_stderr": 0.03157765952162184,
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3739297200017789,
"mc2_stderr": 0.013865567135235702
},
"harness|arc:challenge|25": {
"acc": 0.34812286689419797,
"acc_stderr": 0.013921008595179333,
"acc_norm": 0.3583617747440273,
"acc_norm_stderr": 0.01401288333485986
},
"harness|hellaswag|10": {
"acc": 0.4576777534355706,
"acc_stderr": 0.0049718741597776965,
"acc_norm": 0.6100378410675165,
"acc_norm_stderr": 0.004867445945277154
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.15555555555555556,
"acc_stderr": 0.03130948364878314,
"acc_norm": 0.15555555555555556,
"acc_norm_stderr": 0.03130948364878314
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.18421052631578946,
"acc_stderr": 0.0315469804508223,
"acc_norm": 0.18421052631578946,
"acc_norm_stderr": 0.0315469804508223
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.26,
"acc_stderr": 0.04408440022768079,
"acc_norm": 0.26,
"acc_norm_stderr": 0.04408440022768079
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.2679245283018868,
"acc_stderr": 0.027257260322494845,
"acc_norm": 0.2679245283018868,
"acc_norm_stderr": 0.027257260322494845
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.24305555555555555,
"acc_stderr": 0.0358687928008034,
"acc_norm": 0.24305555555555555,
"acc_norm_stderr": 0.0358687928008034
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.27,
"acc_stderr": 0.0446196043338474,
"acc_norm": 0.27,
"acc_norm_stderr": 0.0446196043338474
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.26,
"acc_stderr": 0.044084400227680794,
"acc_norm": 0.26,
"acc_norm_stderr": 0.044084400227680794
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.26,
"acc_stderr": 0.0440844002276808,
"acc_norm": 0.26,
"acc_norm_stderr": 0.0440844002276808
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.2023121387283237,
"acc_stderr": 0.030631145539198826,
"acc_norm": 0.2023121387283237,
"acc_norm_stderr": 0.030631145539198826
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.20588235294117646,
"acc_stderr": 0.04023382273617749,
"acc_norm": 0.20588235294117646,
"acc_norm_stderr": 0.04023382273617749
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.28,
"acc_stderr": 0.04512608598542127,
"acc_norm": 0.28,
"acc_norm_stderr": 0.04512608598542127
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.24680851063829787,
"acc_stderr": 0.0281854413012341,
"acc_norm": 0.24680851063829787,
"acc_norm_stderr": 0.0281854413012341
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.21052631578947367,
"acc_stderr": 0.038351539543994194,
"acc_norm": 0.21052631578947367,
"acc_norm_stderr": 0.038351539543994194
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.22758620689655173,
"acc_stderr": 0.03493950380131183,
"acc_norm": 0.22758620689655173,
"acc_norm_stderr": 0.03493950380131183
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.25925925925925924,
"acc_stderr": 0.022569897074918417,
"acc_norm": 0.25925925925925924,
"acc_norm_stderr": 0.022569897074918417
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.24603174603174602,
"acc_stderr": 0.038522733649243156,
"acc_norm": 0.24603174603174602,
"acc_norm_stderr": 0.038522733649243156
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.29,
"acc_stderr": 0.04560480215720683,
"acc_norm": 0.29,
"acc_norm_stderr": 0.04560480215720683
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.2161290322580645,
"acc_stderr": 0.02341529343356852,
"acc_norm": 0.2161290322580645,
"acc_norm_stderr": 0.02341529343356852
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.21674876847290642,
"acc_stderr": 0.028990331252516235,
"acc_norm": 0.21674876847290642,
"acc_norm_stderr": 0.028990331252516235
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.21,
"acc_stderr": 0.040936018074033256,
"acc_norm": 0.21,
"acc_norm_stderr": 0.040936018074033256
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.24242424242424243,
"acc_stderr": 0.033464098810559534,
"acc_norm": 0.24242424242424243,
"acc_norm_stderr": 0.033464098810559534
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.21717171717171718,
"acc_stderr": 0.029376616484945637,
"acc_norm": 0.21717171717171718,
"acc_norm_stderr": 0.029376616484945637
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.23316062176165803,
"acc_stderr": 0.030516111371476008,
"acc_norm": 0.23316062176165803,
"acc_norm_stderr": 0.030516111371476008
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.26153846153846155,
"acc_stderr": 0.022282141204204426,
"acc_norm": 0.26153846153846155,
"acc_norm_stderr": 0.022282141204204426
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.24814814814814815,
"acc_stderr": 0.026335739404055803,
"acc_norm": 0.24814814814814815,
"acc_norm_stderr": 0.026335739404055803
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.24369747899159663,
"acc_stderr": 0.027886828078380544,
"acc_norm": 0.24369747899159663,
"acc_norm_stderr": 0.027886828078380544
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.2052980132450331,
"acc_stderr": 0.03297986648473836,
"acc_norm": 0.2052980132450331,
"acc_norm_stderr": 0.03297986648473836
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.24036697247706423,
"acc_stderr": 0.01832060732096407,
"acc_norm": 0.24036697247706423,
"acc_norm_stderr": 0.01832060732096407
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.42592592592592593,
"acc_stderr": 0.033723432716530624,
"acc_norm": 0.42592592592592593,
"acc_norm_stderr": 0.033723432716530624
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.25980392156862747,
"acc_stderr": 0.03077855467869327,
"acc_norm": 0.25980392156862747,
"acc_norm_stderr": 0.03077855467869327
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.25316455696202533,
"acc_stderr": 0.028304657943035307,
"acc_norm": 0.25316455696202533,
"acc_norm_stderr": 0.028304657943035307
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.3542600896860987,
"acc_stderr": 0.032100621541349864,
"acc_norm": 0.3542600896860987,
"acc_norm_stderr": 0.032100621541349864
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.26717557251908397,
"acc_stderr": 0.03880848301082396,
"acc_norm": 0.26717557251908397,
"acc_norm_stderr": 0.03880848301082396
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.24793388429752067,
"acc_stderr": 0.039418975265163025,
"acc_norm": 0.24793388429752067,
"acc_norm_stderr": 0.039418975265163025
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.23148148148148148,
"acc_stderr": 0.04077494709252626,
"acc_norm": 0.23148148148148148,
"acc_norm_stderr": 0.04077494709252626
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.22699386503067484,
"acc_stderr": 0.032910995786157686,
"acc_norm": 0.22699386503067484,
"acc_norm_stderr": 0.032910995786157686
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.30357142857142855,
"acc_stderr": 0.04364226155841044,
"acc_norm": 0.30357142857142855,
"acc_norm_stderr": 0.04364226155841044
},
"harness|hendrycksTest-management|5": {
"acc": 0.24271844660194175,
"acc_stderr": 0.04245022486384493,
"acc_norm": 0.24271844660194175,
"acc_norm_stderr": 0.04245022486384493
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.27350427350427353,
"acc_stderr": 0.029202540153431166,
"acc_norm": 0.27350427350427353,
"acc_norm_stderr": 0.029202540153431166
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.2822477650063857,
"acc_stderr": 0.016095302969878565,
"acc_norm": 0.2822477650063857,
"acc_norm_stderr": 0.016095302969878565
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.21676300578034682,
"acc_stderr": 0.02218347766841285,
"acc_norm": 0.21676300578034682,
"acc_norm_stderr": 0.02218347766841285
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.24804469273743016,
"acc_stderr": 0.014444157808261427,
"acc_norm": 0.24804469273743016,
"acc_norm_stderr": 0.014444157808261427
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.24836601307189543,
"acc_stderr": 0.024739981355113592,
"acc_norm": 0.24836601307189543,
"acc_norm_stderr": 0.024739981355113592
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.2733118971061093,
"acc_stderr": 0.02531176597542612,
"acc_norm": 0.2733118971061093,
"acc_norm_stderr": 0.02531176597542612
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.2623456790123457,
"acc_stderr": 0.02447722285613511,
"acc_norm": 0.2623456790123457,
"acc_norm_stderr": 0.02447722285613511
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.2765957446808511,
"acc_stderr": 0.02668456434046099,
"acc_norm": 0.2765957446808511,
"acc_norm_stderr": 0.02668456434046099
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.23598435462842243,
"acc_stderr": 0.010844802669662689,
"acc_norm": 0.23598435462842243,
"acc_norm_stderr": 0.010844802669662689
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.2757352941176471,
"acc_stderr": 0.027146271936625162,
"acc_norm": 0.2757352941176471,
"acc_norm_stderr": 0.027146271936625162
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.2581699346405229,
"acc_stderr": 0.017704531653250075,
"acc_norm": 0.2581699346405229,
"acc_norm_stderr": 0.017704531653250075
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.2818181818181818,
"acc_stderr": 0.043091187099464585,
"acc_norm": 0.2818181818181818,
"acc_norm_stderr": 0.043091187099464585
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.13877551020408163,
"acc_stderr": 0.022131950419972655,
"acc_norm": 0.13877551020408163,
"acc_norm_stderr": 0.022131950419972655
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.24378109452736318,
"acc_stderr": 0.030360490154014645,
"acc_norm": 0.24378109452736318,
"acc_norm_stderr": 0.030360490154014645
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-virology|5": {
"acc": 0.3253012048192771,
"acc_stderr": 0.036471685236832266,
"acc_norm": 0.3253012048192771,
"acc_norm_stderr": 0.036471685236832266
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.3216374269005848,
"acc_stderr": 0.03582529442573122,
"acc_norm": 0.3216374269005848,
"acc_norm_stderr": 0.03582529442573122
},
"harness|truthfulqa:mc|0": {
"mc1": 0.22888616891064872,
"mc1_stderr": 0.014706994909055027,
"mc2": 0.3739297200017789,
"mc2_stderr": 0.013865567135235702
},
"harness|winogrande|5": {
"acc": 0.6077348066298343,
"acc_stderr": 0.013722400462000885
},
"harness|gsm8k|5": {
"acc": 0.018953752843062926,
"acc_stderr": 0.0037560783410314712
}
}
```
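The `acc` reported under `all` is a macro-average of the per-task scores. A minimal sketch of recomputing the MMLU (hendrycksTest) portion from a few of the entries above (task keys copied verbatim; the subset is truncated for brevity, so the number differs from the full 57-task average):

```python
from statistics import mean

# A small subset of the per-task entries from the results JSON above.
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.21},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.15555555555555556},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.18421052631578946},
    "harness|winogrande|5": {"acc": 0.6077348066298343},  # excluded from MMLU
}

# Macro-average over the MMLU (hendrycksTest) subtasks only.
mmlu_acc = mean(
    scores["acc"]
    for task, scores in results.items()
    if task.startswith("harness|hendrycksTest-")
)
print(round(mmlu_acc, 4))  # -> 0.1833
```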
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
adleme94/chocopedia | adleme94 | "2024-03-10T16:46:40Z" | 0 | 0 | [
"license:mit",
"region:us"
] | null | "2024-03-10T16:46:40Z" | ---
license: mit
---
|
open-llm-leaderboard-old/details_grimjim__kuno-kunoichi-v1-DPO-v2-SLERP-7B | open-llm-leaderboard-old | "2024-03-10T16:55:02Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T16:49:24Z" | ---
pretty_name: Evaluation run of grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B](https://huggingface.co/grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_grimjim__kuno-kunoichi-v1-DPO-v2-SLERP-7B\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T16:52:41.232244](https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kuno-kunoichi-v1-DPO-v2-SLERP-7B/blob/main/results_2024-03-10T16-52-41.232244.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6521245844299478,\n\
\ \"acc_stderr\": 0.03206483289505714,\n \"acc_norm\": 0.6525873691124378,\n\
\ \"acc_norm_stderr\": 0.03271832415925097,\n \"mc1\": 0.4908200734394125,\n\
\ \"mc1_stderr\": 0.01750055072481975,\n \"mc2\": 0.6512373349905823,\n\
\ \"mc2_stderr\": 0.015414384395752878\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6646757679180887,\n \"acc_stderr\": 0.01379618294778556,\n\
\ \"acc_norm\": 0.6911262798634812,\n \"acc_norm_stderr\": 0.013501770929344003\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7005576578370842,\n\
\ \"acc_stderr\": 0.004570777326263903,\n \"acc_norm\": 0.8733320055765784,\n\
\ \"acc_norm_stderr\": 0.003319209400135123\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.0479372485441102,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.0479372485441102\n },\n\
\ \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6296296296296297,\n\
\ \"acc_stderr\": 0.041716541613545426,\n \"acc_norm\": 0.6296296296296297,\n\
\ \"acc_norm_stderr\": 0.041716541613545426\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6776315789473685,\n \"acc_stderr\": 0.03803510248351585,\n\
\ \"acc_norm\": 0.6776315789473685,\n \"acc_norm_stderr\": 0.03803510248351585\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.59,\n\
\ \"acc_stderr\": 0.049431107042371025,\n \"acc_norm\": 0.59,\n \
\ \"acc_norm_stderr\": 0.049431107042371025\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569526,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569526\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7638888888888888,\n\
\ \"acc_stderr\": 0.03551446610810826,\n \"acc_norm\": 0.7638888888888888,\n\
\ \"acc_norm_stderr\": 0.03551446610810826\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.5,\n \"acc_stderr\": 0.050251890762960605,\n \
\ \"acc_norm\": 0.5,\n \"acc_norm_stderr\": 0.050251890762960605\n \
\ },\n \"harness|hendrycksTest-college_mathematics|5\": {\n \"acc\": 0.3,\n\
\ \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\": 0.3,\n \
\ \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-college_medicine|5\"\
: {\n \"acc\": 0.6878612716763006,\n \"acc_stderr\": 0.03533133389323657,\n\
\ \"acc_norm\": 0.6878612716763006,\n \"acc_norm_stderr\": 0.03533133389323657\n\
\ },\n \"harness|hendrycksTest-college_physics|5\": {\n \"acc\": 0.4411764705882353,\n\
\ \"acc_stderr\": 0.049406356306056595,\n \"acc_norm\": 0.4411764705882353,\n\
\ \"acc_norm_stderr\": 0.049406356306056595\n },\n \"harness|hendrycksTest-computer_security|5\"\
: {\n \"acc\": 0.77,\n \"acc_stderr\": 0.04229525846816506,\n \
\ \"acc_norm\": 0.77,\n \"acc_norm_stderr\": 0.04229525846816506\n \
\ },\n \"harness|hendrycksTest-conceptual_physics|5\": {\n \"acc\": 0.5957446808510638,\n\
\ \"acc_stderr\": 0.03208115750788684,\n \"acc_norm\": 0.5957446808510638,\n\
\ \"acc_norm_stderr\": 0.03208115750788684\n },\n \"harness|hendrycksTest-econometrics|5\"\
: {\n \"acc\": 0.49122807017543857,\n \"acc_stderr\": 0.04702880432049615,\n\
\ \"acc_norm\": 0.49122807017543857,\n \"acc_norm_stderr\": 0.04702880432049615\n\
\ },\n \"harness|hendrycksTest-electrical_engineering|5\": {\n \"acc\"\
: 0.5379310344827586,\n \"acc_stderr\": 0.04154659671707548,\n \"\
acc_norm\": 0.5379310344827586,\n \"acc_norm_stderr\": 0.04154659671707548\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.3968253968253968,\n \"acc_stderr\": 0.02519710107424649,\n \"\
acc_norm\": 0.3968253968253968,\n \"acc_norm_stderr\": 0.02519710107424649\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.46825396825396826,\n\
\ \"acc_stderr\": 0.04463112720677171,\n \"acc_norm\": 0.46825396825396826,\n\
\ \"acc_norm_stderr\": 0.04463112720677171\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.35,\n \"acc_stderr\": 0.047937248544110196,\n \
\ \"acc_norm\": 0.35,\n \"acc_norm_stderr\": 0.047937248544110196\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7967741935483871,\n \"acc_stderr\": 0.022891687984554963,\n \"\
acc_norm\": 0.7967741935483871,\n \"acc_norm_stderr\": 0.022891687984554963\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.5024630541871922,\n \"acc_stderr\": 0.035179450386910616,\n \"\
acc_norm\": 0.5024630541871922,\n \"acc_norm_stderr\": 0.035179450386910616\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \"acc_norm\"\
: 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7828282828282829,\n \"acc_stderr\": 0.02937661648494563,\n \"\
acc_norm\": 0.7828282828282829,\n \"acc_norm_stderr\": 0.02937661648494563\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9015544041450777,\n \"acc_stderr\": 0.021500249576033484,\n\
\ \"acc_norm\": 0.9015544041450777,\n \"acc_norm_stderr\": 0.021500249576033484\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6743589743589744,\n \"acc_stderr\": 0.02375966576741229,\n \
\ \"acc_norm\": 0.6743589743589744,\n \"acc_norm_stderr\": 0.02375966576741229\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131154,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131154\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.029344572500634335,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.029344572500634335\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.3708609271523179,\n \"acc_stderr\": 0.03943966699183629,\n \"\
acc_norm\": 0.3708609271523179,\n \"acc_norm_stderr\": 0.03943966699183629\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8495412844036697,\n \"acc_stderr\": 0.015328563932669237,\n \"\
acc_norm\": 0.8495412844036697,\n \"acc_norm_stderr\": 0.015328563932669237\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5231481481481481,\n \"acc_stderr\": 0.03406315360711507,\n \"\
acc_norm\": 0.5231481481481481,\n \"acc_norm_stderr\": 0.03406315360711507\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8382352941176471,\n \"acc_stderr\": 0.025845017986926917,\n \"\
acc_norm\": 0.8382352941176471,\n \"acc_norm_stderr\": 0.025845017986926917\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8016877637130801,\n \"acc_stderr\": 0.025955020841621112,\n \
\ \"acc_norm\": 0.8016877637130801,\n \"acc_norm_stderr\": 0.025955020841621112\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7633587786259542,\n \"acc_stderr\": 0.03727673575596913,\n\
\ \"acc_norm\": 0.7633587786259542,\n \"acc_norm_stderr\": 0.03727673575596913\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.8148148148148148,\n\
\ \"acc_stderr\": 0.03755265865037181,\n \"acc_norm\": 0.8148148148148148,\n\
\ \"acc_norm_stderr\": 0.03755265865037181\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7975460122699386,\n \"acc_stderr\": 0.031570650789119005,\n\
\ \"acc_norm\": 0.7975460122699386,\n \"acc_norm_stderr\": 0.031570650789119005\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.45535714285714285,\n\
\ \"acc_stderr\": 0.047268355537191,\n \"acc_norm\": 0.45535714285714285,\n\
\ \"acc_norm_stderr\": 0.047268355537191\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7864077669902912,\n \"acc_stderr\": 0.040580420156460344,\n\
\ \"acc_norm\": 0.7864077669902912,\n \"acc_norm_stderr\": 0.040580420156460344\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8632478632478633,\n\
\ \"acc_stderr\": 0.022509033937077802,\n \"acc_norm\": 0.8632478632478633,\n\
\ \"acc_norm_stderr\": 0.022509033937077802\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.73,\n \"acc_stderr\": 0.044619604333847394,\n \
\ \"acc_norm\": 0.73,\n \"acc_norm_stderr\": 0.044619604333847394\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8352490421455939,\n\
\ \"acc_stderr\": 0.013265346261323797,\n \"acc_norm\": 0.8352490421455939,\n\
\ \"acc_norm_stderr\": 0.013265346261323797\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7456647398843931,\n \"acc_stderr\": 0.023445826276545543,\n\
\ \"acc_norm\": 0.7456647398843931,\n \"acc_norm_stderr\": 0.023445826276545543\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4491620111731844,\n\
\ \"acc_stderr\": 0.016635838341631928,\n \"acc_norm\": 0.4491620111731844,\n\
\ \"acc_norm_stderr\": 0.016635838341631928\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7254901960784313,\n \"acc_stderr\": 0.02555316999182652,\n\
\ \"acc_norm\": 0.7254901960784313,\n \"acc_norm_stderr\": 0.02555316999182652\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.6977491961414791,\n\
\ \"acc_stderr\": 0.026082700695399665,\n \"acc_norm\": 0.6977491961414791,\n\
\ \"acc_norm_stderr\": 0.026082700695399665\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7438271604938271,\n \"acc_stderr\": 0.024288533637726095,\n\
\ \"acc_norm\": 0.7438271604938271,\n \"acc_norm_stderr\": 0.024288533637726095\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4929078014184397,\n \"acc_stderr\": 0.02982449855912901,\n \
\ \"acc_norm\": 0.4929078014184397,\n \"acc_norm_stderr\": 0.02982449855912901\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.46870925684485004,\n\
\ \"acc_stderr\": 0.012745204626083136,\n \"acc_norm\": 0.46870925684485004,\n\
\ \"acc_norm_stderr\": 0.012745204626083136\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6985294117647058,\n \"acc_stderr\": 0.027875982114273168,\n\
\ \"acc_norm\": 0.6985294117647058,\n \"acc_norm_stderr\": 0.027875982114273168\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6535947712418301,\n \"acc_stderr\": 0.01924978569171721,\n \
\ \"acc_norm\": 0.6535947712418301,\n \"acc_norm_stderr\": 0.01924978569171721\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7306122448979592,\n \"acc_stderr\": 0.02840125202902294,\n\
\ \"acc_norm\": 0.7306122448979592,\n \"acc_norm_stderr\": 0.02840125202902294\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.845771144278607,\n\
\ \"acc_stderr\": 0.025538433368578327,\n \"acc_norm\": 0.845771144278607,\n\
\ \"acc_norm_stderr\": 0.025538433368578327\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5301204819277109,\n\
\ \"acc_stderr\": 0.03885425420866767,\n \"acc_norm\": 0.5301204819277109,\n\
\ \"acc_norm_stderr\": 0.03885425420866767\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8538011695906432,\n \"acc_stderr\": 0.027097290118070806,\n\
\ \"acc_norm\": 0.8538011695906432,\n \"acc_norm_stderr\": 0.027097290118070806\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4908200734394125,\n\
\ \"mc1_stderr\": 0.01750055072481975,\n \"mc2\": 0.6512373349905823,\n\
\ \"mc2_stderr\": 0.015414384395752878\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8089976322020521,\n \"acc_stderr\": 0.011047808761510427\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6671721000758151,\n \
\ \"acc_stderr\": 0.012979892496598283\n }\n}\n```"
repo_url: https://huggingface.co/grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|arc:challenge|25_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|arc:challenge|25_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|gsm8k|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|gsm8k|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hellaswag|10_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hellaswag|10_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-47-06.900885.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-52-41.232244.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T16-52-41.232244.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- '**/details_harness|winogrande|5_2024-03-10T16-47-06.900885.parquet'
- split: 2024_03_10T16_52_41.232244
path:
- '**/details_harness|winogrande|5_2024-03-10T16-52-41.232244.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T16-52-41.232244.parquet'
- config_name: results
data_files:
- split: 2024_03_10T16_47_06.900885
path:
- results_2024-03-10T16-47-06.900885.parquet
- split: 2024_03_10T16_52_41.232244
path:
- results_2024-03-10T16-52-41.232244.parquet
- split: latest
path:
- results_2024-03-10T16-52-41.232244.parquet
---
# Dataset Card for Evaluation run of grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B](https://huggingface.co/grimjim/kuno-kunoichi-v1-DPO-v2-SLERP-7B) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split is always pointing to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_grimjim__kuno-kunoichi-v1-DPO-v2-SLERP-7B",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T16:52:41.232244](https://huggingface.co/datasets/open-llm-leaderboard/details_grimjim__kuno-kunoichi-v1-DPO-v2-SLERP-7B/blob/main/results_2024-03-10T16-52-41.232244.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks. You can find each in the results and the "latest" split for each eval):
```python
{
"all": {
"acc": 0.6521245844299478,
"acc_stderr": 0.03206483289505714,
"acc_norm": 0.6525873691124378,
"acc_norm_stderr": 0.03271832415925097,
"mc1": 0.4908200734394125,
"mc1_stderr": 0.01750055072481975,
"mc2": 0.6512373349905823,
"mc2_stderr": 0.015414384395752878
},
"harness|arc:challenge|25": {
"acc": 0.6646757679180887,
"acc_stderr": 0.01379618294778556,
"acc_norm": 0.6911262798634812,
"acc_norm_stderr": 0.013501770929344003
},
"harness|hellaswag|10": {
"acc": 0.7005576578370842,
"acc_stderr": 0.004570777326263903,
"acc_norm": 0.8733320055765784,
"acc_norm_stderr": 0.003319209400135123
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.35,
"acc_stderr": 0.0479372485441102,
"acc_norm": 0.35,
"acc_norm_stderr": 0.0479372485441102
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6296296296296297,
"acc_stderr": 0.041716541613545426,
"acc_norm": 0.6296296296296297,
"acc_norm_stderr": 0.041716541613545426
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6776315789473685,
"acc_stderr": 0.03803510248351585,
"acc_norm": 0.6776315789473685,
"acc_norm_stderr": 0.03803510248351585
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.59,
"acc_stderr": 0.049431107042371025,
"acc_norm": 0.59,
"acc_norm_stderr": 0.049431107042371025
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569526,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569526
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7638888888888888,
"acc_stderr": 0.03551446610810826,
"acc_norm": 0.7638888888888888,
"acc_norm_stderr": 0.03551446610810826
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.5,
"acc_stderr": 0.050251890762960605,
"acc_norm": 0.5,
"acc_norm_stderr": 0.050251890762960605
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.3,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.3,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6878612716763006,
"acc_stderr": 0.03533133389323657,
"acc_norm": 0.6878612716763006,
"acc_norm_stderr": 0.03533133389323657
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4411764705882353,
"acc_stderr": 0.049406356306056595,
"acc_norm": 0.4411764705882353,
"acc_norm_stderr": 0.049406356306056595
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.77,
"acc_stderr": 0.04229525846816506,
"acc_norm": 0.77,
"acc_norm_stderr": 0.04229525846816506
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5957446808510638,
"acc_stderr": 0.03208115750788684,
"acc_norm": 0.5957446808510638,
"acc_norm_stderr": 0.03208115750788684
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.49122807017543857,
"acc_stderr": 0.04702880432049615,
"acc_norm": 0.49122807017543857,
"acc_norm_stderr": 0.04702880432049615
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5379310344827586,
"acc_stderr": 0.04154659671707548,
"acc_norm": 0.5379310344827586,
"acc_norm_stderr": 0.04154659671707548
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.3968253968253968,
"acc_stderr": 0.02519710107424649,
"acc_norm": 0.3968253968253968,
"acc_norm_stderr": 0.02519710107424649
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.46825396825396826,
"acc_stderr": 0.04463112720677171,
"acc_norm": 0.46825396825396826,
"acc_norm_stderr": 0.04463112720677171
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.35,
"acc_stderr": 0.047937248544110196,
"acc_norm": 0.35,
"acc_norm_stderr": 0.047937248544110196
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7967741935483871,
"acc_stderr": 0.022891687984554963,
"acc_norm": 0.7967741935483871,
"acc_norm_stderr": 0.022891687984554963
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5024630541871922,
"acc_stderr": 0.035179450386910616,
"acc_norm": 0.5024630541871922,
"acc_norm_stderr": 0.035179450386910616
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7828282828282829,
"acc_stderr": 0.02937661648494563,
"acc_norm": 0.7828282828282829,
"acc_norm_stderr": 0.02937661648494563
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9015544041450777,
"acc_stderr": 0.021500249576033484,
"acc_norm": 0.9015544041450777,
"acc_norm_stderr": 0.021500249576033484
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6743589743589744,
"acc_stderr": 0.02375966576741229,
"acc_norm": 0.6743589743589744,
"acc_norm_stderr": 0.02375966576741229
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131154,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131154
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.029344572500634335,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.029344572500634335
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.3708609271523179,
"acc_stderr": 0.03943966699183629,
"acc_norm": 0.3708609271523179,
"acc_norm_stderr": 0.03943966699183629
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8495412844036697,
"acc_stderr": 0.015328563932669237,
"acc_norm": 0.8495412844036697,
"acc_norm_stderr": 0.015328563932669237
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5231481481481481,
"acc_stderr": 0.03406315360711507,
"acc_norm": 0.5231481481481481,
"acc_norm_stderr": 0.03406315360711507
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8382352941176471,
"acc_stderr": 0.025845017986926917,
"acc_norm": 0.8382352941176471,
"acc_norm_stderr": 0.025845017986926917
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8016877637130801,
"acc_stderr": 0.025955020841621112,
"acc_norm": 0.8016877637130801,
"acc_norm_stderr": 0.025955020841621112
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7633587786259542,
"acc_stderr": 0.03727673575596913,
"acc_norm": 0.7633587786259542,
"acc_norm_stderr": 0.03727673575596913
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.8148148148148148,
"acc_stderr": 0.03755265865037181,
"acc_norm": 0.8148148148148148,
"acc_norm_stderr": 0.03755265865037181
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7975460122699386,
"acc_stderr": 0.031570650789119005,
"acc_norm": 0.7975460122699386,
"acc_norm_stderr": 0.031570650789119005
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.45535714285714285,
"acc_stderr": 0.047268355537191,
"acc_norm": 0.45535714285714285,
"acc_norm_stderr": 0.047268355537191
},
"harness|hendrycksTest-management|5": {
"acc": 0.7864077669902912,
"acc_stderr": 0.040580420156460344,
"acc_norm": 0.7864077669902912,
"acc_norm_stderr": 0.040580420156460344
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8632478632478633,
"acc_stderr": 0.022509033937077802,
"acc_norm": 0.8632478632478633,
"acc_norm_stderr": 0.022509033937077802
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.73,
"acc_stderr": 0.044619604333847394,
"acc_norm": 0.73,
"acc_norm_stderr": 0.044619604333847394
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8352490421455939,
"acc_stderr": 0.013265346261323797,
"acc_norm": 0.8352490421455939,
"acc_norm_stderr": 0.013265346261323797
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7456647398843931,
"acc_stderr": 0.023445826276545543,
"acc_norm": 0.7456647398843931,
"acc_norm_stderr": 0.023445826276545543
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4491620111731844,
"acc_stderr": 0.016635838341631928,
"acc_norm": 0.4491620111731844,
"acc_norm_stderr": 0.016635838341631928
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7254901960784313,
"acc_stderr": 0.02555316999182652,
"acc_norm": 0.7254901960784313,
"acc_norm_stderr": 0.02555316999182652
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.6977491961414791,
"acc_stderr": 0.026082700695399665,
"acc_norm": 0.6977491961414791,
"acc_norm_stderr": 0.026082700695399665
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7438271604938271,
"acc_stderr": 0.024288533637726095,
"acc_norm": 0.7438271604938271,
"acc_norm_stderr": 0.024288533637726095
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4929078014184397,
"acc_stderr": 0.02982449855912901,
"acc_norm": 0.4929078014184397,
"acc_norm_stderr": 0.02982449855912901
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.46870925684485004,
"acc_stderr": 0.012745204626083136,
"acc_norm": 0.46870925684485004,
"acc_norm_stderr": 0.012745204626083136
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6985294117647058,
"acc_stderr": 0.027875982114273168,
"acc_norm": 0.6985294117647058,
"acc_norm_stderr": 0.027875982114273168
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6535947712418301,
"acc_stderr": 0.01924978569171721,
"acc_norm": 0.6535947712418301,
"acc_norm_stderr": 0.01924978569171721
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7306122448979592,
"acc_stderr": 0.02840125202902294,
"acc_norm": 0.7306122448979592,
"acc_norm_stderr": 0.02840125202902294
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.845771144278607,
"acc_stderr": 0.025538433368578327,
"acc_norm": 0.845771144278607,
"acc_norm_stderr": 0.025538433368578327
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5301204819277109,
"acc_stderr": 0.03885425420866767,
"acc_norm": 0.5301204819277109,
"acc_norm_stderr": 0.03885425420866767
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8538011695906432,
"acc_stderr": 0.027097290118070806,
"acc_norm": 0.8538011695906432,
"acc_norm_stderr": 0.027097290118070806
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4908200734394125,
"mc1_stderr": 0.01750055072481975,
"mc2": 0.6512373349905823,
"mc2_stderr": 0.015414384395752878
},
"harness|winogrande|5": {
"acc": 0.8089976322020521,
"acc_stderr": 0.011047808761510427
},
"harness|gsm8k|5": {
"acc": 0.6671721000758151,
"acc_stderr": 0.012979892496598283
}
}
```
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
Scorneddd/V_Lay | Scorneddd | "2024-03-10T17:01:05Z" | 0 | 0 | [
"license:apache-2.0",
"region:us"
] | null | "2024-03-10T16:52:23Z" | ---
license: apache-2.0
---
|
BramVanroy/ultra_feedback_dutch_cleaned_multi | BramVanroy | "2024-03-27T15:54:53Z" | 0 | 0 | [
"task_categories:text-generation",
"language:nl",
"license:cc-by-nc-4.0",
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us",
"conversational"
] | [
"text-generation"
] | "2024-03-10T16:58:28Z" | ---
language:
- nl
license: cc-by-nc-4.0
size_categories:
- 10K<n<100K
task_categories:
- text-generation
pretty_name: Ultra Feedback Dutch Cleaned
dataset_info:
features:
- name: GEITje-7B-ultra
dtype: string
- name: TowerInstruct-13B-v0.1
dtype: string
- name: TowerInstruct-7B-v0.2
dtype: string
- name: geitje-7b-chat
dtype: string
- name: gpt-4-turbo
dtype: string
- name: llama-2-13b-chat-dutch
dtype: string
- name: prompt
dtype: string
- name: prompt_dutch
dtype: string
splits:
- name: train
num_bytes: 624697211
num_examples: 59885
download_size: 362587024
dataset_size: 624697211
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
tags:
- conversational
---
# Ultra Feedback Dutch Cleaned
**This dataset should not be used unless you are interested in all model generations. Instead, refer to the rated and [further filtered version](https://huggingface.co/datasets/BramVanroy/ultra_feedback_dutch_cleaned/).**
---
This is a cleaned version of [BramVanroy/ultra_feedback_dutch](https://huggingface.co/datasets/BramVanroy/ultra_feedback_dutch), based on the [cleaning](https://huggingface.co/datasets/argilla/ultrafeedback-binarized-preferences-cleaned) done by Argilla on the original Ultra Feedback dataset.
It contains multiple LM responses from:
- GEITje-7B-ultra
- TowerInstruct-13B-v0.1
- TowerInstruct-7B-v0.2
- GEITje-7B-chat
- gpt-4-turbo
- llama-2-13b-chat-dutch
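Each row pairs one prompt with a generation from every model listed above. A minimal sketch of collecting those per-model responses from a row (the column names are taken from the `dataset_info` block above; in practice `row` would come from `datasets.load_dataset("BramVanroy/ultra_feedback_dutch_cleaned_multi", split="train")` — the toy row here is only for illustration):

```python
# Column names as declared in the dataset_info features above.
MODEL_COLUMNS = [
    "GEITje-7B-ultra",
    "TowerInstruct-13B-v0.1",
    "TowerInstruct-7B-v0.2",
    "geitje-7b-chat",
    "gpt-4-turbo",
    "llama-2-13b-chat-dutch",
]

def responses_by_model(row: dict) -> dict:
    """Map each model column to its generated response for one example."""
    return {model: row[model] for model in MODEL_COLUMNS}

# Toy row shaped like the schema above (a real row comes from load_dataset):
row = {m: f"antwoord ({m})" for m in MODEL_COLUMNS}
row["prompt"] = "Explain photosynthesis."
row["prompt_dutch"] = "Leg fotosynthese uit."

out = responses_by_model(row)
print(len(out))  # → 6
```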
|
ryan2009/MCPH | ryan2009 | "2024-03-10T17:11:00Z" | 0 | 0 | [
"license:openrail",
"region:us"
] | null | "2024-03-10T17:01:54Z" | ---
license: openrail
---
|
NorGLM/NO-Multi-QA-Sum | NorGLM | "2024-10-01T18:29:08Z" | 0 | 1 | [
"language:no",
"license:cc-by-nc-sa-4.0",
"size_categories:n<1K",
"format:csv",
"modality:text",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"arxiv:2312.01314",
"region:us"
] | null | "2024-03-10T17:04:39Z" | ---
license: cc-by-nc-sa-4.0
language:
- 'no'
---
# Dataset Card
## Dataset Summary
NO-Multi-QA-Sum is a Norwegian multi-task, human-annotated dataset. It is part of the NLEBench Norwegian benchmarks and can be used to evaluate language models on machine reading comprehension, document-grounded question answering, and abstractive summarization.
## Language
The data in NO-Multi-QA-Sum are in Norwegian Bokmål.
## Data Instances
For each instance, there is an article string, a category, a summary string, and a list of question-answer pairs, representing the news article, its categorical information, an abstractive summary of the article, and question-answer pairs based on the content of the article.
An example instance is shown as follows:
```
{
article: " (Stavanger Oilers – Sparta 2–0, 4–3 i kamper) Amerikanerne Rob Bordson (33) og Steven Whitney (31) sendte Oilers til sin første NM-finale siden 2017. – Det var sinnssykt. To bra lag og det var kult å spille. Sikkert kult å se på også. Men til syvende og sist synes jeg vi fortjener å vinne, sier Oilers-spiller Tommy Kristiansen til TV 2. Etter et sjansesløseri uten like måtte vertene fra «oljebyen» finne seg i å gå hele veien til 3. periode før pucken endelig gikk i nettet. Ludvig Hoff sendte en liten stikker inn til Rob Bordson som banket inn kampens første mål. Amerikaneren sendte DNB Arena til himmels, og Stavanger-fansen øynet håp om sin første NM-finale på fem år (de to siste sesongene har det ikke vært noen finale, på grunn av pandemien). Minutter senere spilte Markus Søberg seg alene med Sparta-keeper Tobias Normann, men sisteskansen kom seirende ut av duellen. Med et og et halvt minutt igjen på klokken tok Sparta ut keeperen sin i et desperat forsøk på å utligne. – Det er et bra lag vi møter. En tøff arena å spille i, med bra fans. Men jeg synes vi gir det en fair sjanse, men det gikk ikke i dag, dessverre, sier Normann til TV 2. Dessverre for gjestene resulterte det i at Oilers doblet ledelsen sin da Steven Whitney skøyt pucken mot åpent mål. Det var spikeren i kisten for Sparta, som måtte se finalehåpet ryke. I finalen møter de Storhamar, som tok seg videre fra semifinalene etter 4–1 i kamper mot Stjernen. Nå får lagene en drøy ukes pause. Stavanger møter i Storhamar hjemme i første finalekamp 2. påskedag.Kamp 1: Stavanger Oilers – Sparta 5–4Kamp 2: Sparta – Stavanger Oilers 3–2Kamp 3: Stavanger Oilers – Sparta 1–2Kamp 4: Sparta – Stavanger Oilers 1–2Kamp 5: Stavanger Oilers – Sparta 3–2Kamp 6: Sparta – Stavanger Oilers 2–1Kamp 7: Stavanger Oilers – Sparta 2–0Oilers vant Fjordkraftligaen med 105 poeng på sine 45 kamper i ligaen. Finalemotstander Storhamar endte helt nede på 6.-plass.",
category: Ishockey,
summary: " Stavanger Oilers har avansert til NM-finalen for første gang siden 2017, takket være spill av amerikanerne Rob Bordson og Steven Whitney. De vil møte Storhamar i finalen, etter å ha vunnet mot Sparta med spillresultatene 5–4, 1–2, 3–2, 2–1, og 2–0. Oilers klarte også å vinne Fjordkraftligaen med 105 poeng fra sine 45 kamper.",
question_answer: "[[' Hvem sendte Stavanger Oilers til NM-finalen?', ' Amerikanerne Rob Bordson og Steven Whitney sendte Oilers til NM-finalen.'], [' Hvilket år deltok Oilers sist i NM-finalen før dette året?', ' Oilers deltok sist i NM-finalen i 2017.'], [' Hva tror Sparta-keeper Tobias Normann om matchen?', ' Normann tror at de ga det en rettferdig sjanse, men det gikk dessverre ikke deres vei.'], [' Hvem skal passere Oilers i finalen?', ' Oilers skal møte Storhamar i NM-finalen.'], [' Hvilke resultater vant Oilers for å nå NM-finalen?', ' Oilers vant med spillresultatene 5–4, 1–2, 3–2, 2–1, 2–0 over Sparta for å nå NM-finalen.'], [' Hvordan gikk Oilers i Fjordkraftligaen?', ' Oilers vant Fjordkraftligaen med 105 poeng fra sine 45 kamper.']]"
}
```
## Data Split
The dataset is split according to whether the question-answer content contains the summary: articles whose QA pairs contain the summary are saved in *data_contain.csv*, and the remaining articles in *data_not_contain.csv*.
| Split | #articles |
|------------------|-----------|
| data_contain | 71 |
| data_not_contain | 396 |
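The containment check that defines this split can be sketched as a simple substring test. This is a hypothetical reimplementation for illustration only; the exact matching rule the authors used to produce the two CSV files is not documented in this card.

```python
def qa_contains_summary(summary: str, qa_pairs: list) -> bool:
    """Sketch of the split criterion: does the QA content contain the summary?

    `qa_pairs` is a list of [question, answer] pairs, as in the example above.
    Here the summary counts as "contained" when every sentence of it appears
    verbatim in the concatenated QA text.
    """
    qa_text = " ".join(question + " " + answer for question, answer in qa_pairs)
    sentences = [s.strip() for s in summary.split(".") if s.strip()]
    return bool(sentences) and all(s in qa_text for s in sentences)


# Toy usage with invented Norwegian strings:
pairs = [["Hvem vant?", "Stavanger Oilers vant kampen"]]
qa_contains_summary("Stavanger Oilers vant kampen.", pairs)  # → True
qa_contains_summary("Oilers tapte.", pairs)                  # → False
```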
## Citation Information
If you find our work helpful, please cite our paper:
```
@article{liu2023nlebench+,
title={NLEBench+ NorGLM: A Comprehensive Empirical Analysis and Benchmark Dataset for Generative Language Models in Norwegian},
author={Liu, Peng and Zhang, Lemei and Farup, Terje Nissen and Lauvrak, Even W and Ingvaldsen, Jon Espen and Eide, Simen and Gulla, Jon Atle and Yang, Zhirong},
journal={arXiv preprint arXiv:2312.01314},
year={2023}
}
```
|
wandb/deita-10k-v0-sft-latin | wandb | "2024-03-10T17:16:23Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T17:05:49Z" | ---
dataset_info:
features:
- name: prompt
dtype: string
- name: prompt_id
dtype: string
- name: messages
list:
- name: content
dtype: string
- name: role
dtype: string
splits:
- name: train_sft
num_bytes: 306404981
num_examples: 8553
- name: test_sft
num_bytes: 15688979
num_examples: 448
- name: train_gen
num_bytes: 294678464
num_examples: 8555
- name: test_gen
num_bytes: 15083472
num_examples: 448
download_size: 252913263
dataset_size: 631855896
configs:
- config_name: default
data_files:
- split: train_sft
path: data/train_sft-*
- split: test_sft
path: data/test_sft-*
- split: train_gen
path: data/train_gen-*
- split: test_gen
path: data/test_gen-*
---
Same as HuggingFaceH4/deita-10k-v0-sft, but with non-Latin text removed.
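The non-Latin filter can be approximated with a Unicode-range check like the following sketch. The ranges and the exact criterion used to build this dataset are assumptions, not documented here.

```python
import re

# Anything outside Basic Latin, Latin-1, Latin Extended-A/B, and Latin
# Extended Additional counts as "non-Latin" in this sketch.
NON_LATIN = re.compile(r"[^\u0000-\u024F\u1E00-\u1EFF]")


def is_latin_only(text: str) -> bool:
    """Return True if `text` contains no characters outside the ranges above."""
    return NON_LATIN.search(text) is None


is_latin_only("héllo, wörld")  # → True
is_latin_only("こんにちは")      # → False
```

A filter like this could then be applied per message, e.g. `dataset.filter(lambda ex: is_latin_only(ex["prompt"]))` with the `datasets` library.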
|
guyhadad01/Talmud-Hebrew | guyhadad01 | "2024-03-10T17:14:07Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T17:14:03Z" | ---
dataset_info:
features:
- name: id
dtype: string
- name: content
dtype: string
splits:
- name: train
num_bytes: 16557451
num_examples: 37
download_size: 7001416
dataset_size: 16557451
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
damand2061/innermore-x | damand2061 | "2024-03-10T17:26:52Z" | 0 | 0 | [
"license:cc-by-4.0",
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T17:21:27Z" | ---
license: cc-by-4.0
dataset_info:
features:
- name: tokens
dtype: string
- name: tags
dtype: string
splits:
- name: train
num_bytes: 203836
num_examples: 438
- name: validation
num_bytes: 26180
num_examples: 55
download_size: 79051
dataset_size: 230016
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: validation
path: data/validation-*
---
|
open-llm-leaderboard-old/details_Eric111__CatunaLaserPi-DPO | open-llm-leaderboard-old | "2024-03-10T17:22:23Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T17:22:02Z" | ---
pretty_name: Evaluation run of Eric111/CatunaLaserPi-DPO
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [Eric111/CatunaLaserPi-DPO](https://huggingface.co/Eric111/CatunaLaserPi-DPO)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of\
\ the evaluated tasks.\n\nThe dataset has been created from 1 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_Eric111__CatunaLaserPi-DPO\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-10T17:19:49.270307](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__CatunaLaserPi-DPO/blob/main/results_2024-03-10T17-19-49.270307.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.655015176013988,\n\
\ \"acc_stderr\": 0.032068924852289675,\n \"acc_norm\": 0.6548524546452552,\n\
\ \"acc_norm_stderr\": 0.032731365332683136,\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.700075180667757,\n\
\ \"mc2_stderr\": 0.014884649824873666\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.7022184300341296,\n \"acc_stderr\": 0.01336308010724448,\n\
\ \"acc_norm\": 0.7295221843003413,\n \"acc_norm_stderr\": 0.012980954547659556\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.7109141605257917,\n\
\ \"acc_stderr\": 0.004524113671259706,\n \"acc_norm\": 0.8832901812387971,\n\
\ \"acc_norm_stderr\": 0.003204180072942376\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.33,\n \"acc_stderr\": 0.04725815626252605,\n \
\ \"acc_norm\": 0.33,\n \"acc_norm_stderr\": 0.04725815626252605\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.6592592592592592,\n\
\ \"acc_stderr\": 0.040943762699967926,\n \"acc_norm\": 0.6592592592592592,\n\
\ \"acc_norm_stderr\": 0.040943762699967926\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6907894736842105,\n \"acc_stderr\": 0.037610708698674805,\n\
\ \"acc_norm\": 0.6907894736842105,\n \"acc_norm_stderr\": 0.037610708698674805\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.63,\n\
\ \"acc_stderr\": 0.048523658709391,\n \"acc_norm\": 0.63,\n \
\ \"acc_norm_stderr\": 0.048523658709391\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.7056603773584905,\n \"acc_stderr\": 0.02804918631569525,\n\
\ \"acc_norm\": 0.7056603773584905,\n \"acc_norm_stderr\": 0.02804918631569525\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7708333333333334,\n\
\ \"acc_stderr\": 0.03514697467862388,\n \"acc_norm\": 0.7708333333333334,\n\
\ \"acc_norm_stderr\": 0.03514697467862388\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.48,\n \"acc_stderr\": 0.050211673156867795,\n \
\ \"acc_norm\": 0.48,\n \"acc_norm_stderr\": 0.050211673156867795\n \
\ },\n \"harness|hendrycksTest-college_computer_science|5\": {\n \"\
acc\": 0.54,\n \"acc_stderr\": 0.05009082659620333,\n \"acc_norm\"\
: 0.54,\n \"acc_norm_stderr\": 0.05009082659620333\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.31,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.31,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.653179190751445,\n\
\ \"acc_stderr\": 0.036291466701596636,\n \"acc_norm\": 0.653179190751445,\n\
\ \"acc_norm_stderr\": 0.036291466701596636\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.4117647058823529,\n \"acc_stderr\": 0.048971049527263666,\n\
\ \"acc_norm\": 0.4117647058823529,\n \"acc_norm_stderr\": 0.048971049527263666\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.75,\n \"acc_stderr\": 0.04351941398892446,\n \"acc_norm\": 0.75,\n\
\ \"acc_norm_stderr\": 0.04351941398892446\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5659574468085107,\n \"acc_stderr\": 0.03240038086792747,\n\
\ \"acc_norm\": 0.5659574468085107,\n \"acc_norm_stderr\": 0.03240038086792747\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.47368421052631576,\n\
\ \"acc_stderr\": 0.046970851366478626,\n \"acc_norm\": 0.47368421052631576,\n\
\ \"acc_norm_stderr\": 0.046970851366478626\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5586206896551724,\n \"acc_stderr\": 0.04137931034482757,\n\
\ \"acc_norm\": 0.5586206896551724,\n \"acc_norm_stderr\": 0.04137931034482757\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41005291005291006,\n \"acc_stderr\": 0.02533120243894443,\n \"\
acc_norm\": 0.41005291005291006,\n \"acc_norm_stderr\": 0.02533120243894443\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.49206349206349204,\n\
\ \"acc_stderr\": 0.044715725362943486,\n \"acc_norm\": 0.49206349206349204,\n\
\ \"acc_norm_stderr\": 0.044715725362943486\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.37,\n \"acc_stderr\": 0.04852365870939099,\n \
\ \"acc_norm\": 0.37,\n \"acc_norm_stderr\": 0.04852365870939099\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\": 0.7935483870967742,\n\
\ \"acc_stderr\": 0.023025899617188716,\n \"acc_norm\": 0.7935483870967742,\n\
\ \"acc_norm_stderr\": 0.023025899617188716\n },\n \"harness|hendrycksTest-high_school_chemistry|5\"\
: {\n \"acc\": 0.5123152709359606,\n \"acc_stderr\": 0.035169204442208966,\n\
\ \"acc_norm\": 0.5123152709359606,\n \"acc_norm_stderr\": 0.035169204442208966\n\
\ },\n \"harness|hendrycksTest-high_school_computer_science|5\": {\n \
\ \"acc\": 0.7,\n \"acc_stderr\": 0.046056618647183814,\n \"acc_norm\"\
: 0.7,\n \"acc_norm_stderr\": 0.046056618647183814\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7696969696969697,\n \"acc_stderr\": 0.032876667586034906,\n\
\ \"acc_norm\": 0.7696969696969697,\n \"acc_norm_stderr\": 0.032876667586034906\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.8080808080808081,\n \"acc_stderr\": 0.028057791672989017,\n \"\
acc_norm\": 0.8080808080808081,\n \"acc_norm_stderr\": 0.028057791672989017\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.9067357512953368,\n \"acc_stderr\": 0.02098685459328973,\n\
\ \"acc_norm\": 0.9067357512953368,\n \"acc_norm_stderr\": 0.02098685459328973\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.6666666666666666,\n \"acc_stderr\": 0.023901157979402534,\n\
\ \"acc_norm\": 0.6666666666666666,\n \"acc_norm_stderr\": 0.023901157979402534\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.34074074074074073,\n \"acc_stderr\": 0.028897748741131147,\n \
\ \"acc_norm\": 0.34074074074074073,\n \"acc_norm_stderr\": 0.028897748741131147\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.680672268907563,\n \"acc_stderr\": 0.0302839955258844,\n \
\ \"acc_norm\": 0.680672268907563,\n \"acc_norm_stderr\": 0.0302839955258844\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.36423841059602646,\n \"acc_stderr\": 0.03929111781242741,\n \"\
acc_norm\": 0.36423841059602646,\n \"acc_norm_stderr\": 0.03929111781242741\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8477064220183487,\n \"acc_stderr\": 0.015405084393157074,\n \"\
acc_norm\": 0.8477064220183487,\n \"acc_norm_stderr\": 0.015405084393157074\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5138888888888888,\n \"acc_stderr\": 0.034086558679777494,\n \"\
acc_norm\": 0.5138888888888888,\n \"acc_norm_stderr\": 0.034086558679777494\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.8480392156862745,\n \"acc_stderr\": 0.025195658428931792,\n \"\
acc_norm\": 0.8480392156862745,\n \"acc_norm_stderr\": 0.025195658428931792\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.8059071729957806,\n \"acc_stderr\": 0.025744902532290916,\n \
\ \"acc_norm\": 0.8059071729957806,\n \"acc_norm_stderr\": 0.025744902532290916\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.695067264573991,\n\
\ \"acc_stderr\": 0.030898610882477515,\n \"acc_norm\": 0.695067264573991,\n\
\ \"acc_norm_stderr\": 0.030898610882477515\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.7786259541984732,\n \"acc_stderr\": 0.03641297081313729,\n\
\ \"acc_norm\": 0.7786259541984732,\n \"acc_norm_stderr\": 0.03641297081313729\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.7851239669421488,\n \"acc_stderr\": 0.037494924487096966,\n \"\
acc_norm\": 0.7851239669421488,\n \"acc_norm_stderr\": 0.037494924487096966\n\
\ },\n \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7962962962962963,\n\
\ \"acc_stderr\": 0.03893542518824847,\n \"acc_norm\": 0.7962962962962963,\n\
\ \"acc_norm_stderr\": 0.03893542518824847\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7730061349693251,\n \"acc_stderr\": 0.03291099578615769,\n\
\ \"acc_norm\": 0.7730061349693251,\n \"acc_norm_stderr\": 0.03291099578615769\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.4732142857142857,\n\
\ \"acc_stderr\": 0.047389751192741546,\n \"acc_norm\": 0.4732142857142857,\n\
\ \"acc_norm_stderr\": 0.047389751192741546\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7669902912621359,\n \"acc_stderr\": 0.04185832598928315,\n\
\ \"acc_norm\": 0.7669902912621359,\n \"acc_norm_stderr\": 0.04185832598928315\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8803418803418803,\n\
\ \"acc_stderr\": 0.021262719400406957,\n \"acc_norm\": 0.8803418803418803,\n\
\ \"acc_norm_stderr\": 0.021262719400406957\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.71,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.71,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.8314176245210728,\n\
\ \"acc_stderr\": 0.013387895731543604,\n \"acc_norm\": 0.8314176245210728,\n\
\ \"acc_norm_stderr\": 0.013387895731543604\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7427745664739884,\n \"acc_stderr\": 0.023532925431044287,\n\
\ \"acc_norm\": 0.7427745664739884,\n \"acc_norm_stderr\": 0.023532925431044287\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.4301675977653631,\n\
\ \"acc_stderr\": 0.01655860163604104,\n \"acc_norm\": 0.4301675977653631,\n\
\ \"acc_norm_stderr\": 0.01655860163604104\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.7352941176470589,\n \"acc_stderr\": 0.02526169121972948,\n\
\ \"acc_norm\": 0.7352941176470589,\n \"acc_norm_stderr\": 0.02526169121972948\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.7041800643086816,\n\
\ \"acc_stderr\": 0.025922371788818767,\n \"acc_norm\": 0.7041800643086816,\n\
\ \"acc_norm_stderr\": 0.025922371788818767\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7407407407407407,\n \"acc_stderr\": 0.02438366553103545,\n\
\ \"acc_norm\": 0.7407407407407407,\n \"acc_norm_stderr\": 0.02438366553103545\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.49645390070921985,\n \"acc_stderr\": 0.02982674915328092,\n \
\ \"acc_norm\": 0.49645390070921985,\n \"acc_norm_stderr\": 0.02982674915328092\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.47327249022164275,\n\
\ \"acc_stderr\": 0.01275197796767601,\n \"acc_norm\": 0.47327249022164275,\n\
\ \"acc_norm_stderr\": 0.01275197796767601\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6727941176470589,\n \"acc_stderr\": 0.028501452860396553,\n\
\ \"acc_norm\": 0.6727941176470589,\n \"acc_norm_stderr\": 0.028501452860396553\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6781045751633987,\n \"acc_stderr\": 0.018901015322093092,\n \
\ \"acc_norm\": 0.6781045751633987,\n \"acc_norm_stderr\": 0.018901015322093092\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6727272727272727,\n\
\ \"acc_stderr\": 0.0449429086625209,\n \"acc_norm\": 0.6727272727272727,\n\
\ \"acc_norm_stderr\": 0.0449429086625209\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7183673469387755,\n \"acc_stderr\": 0.028795185574291293,\n\
\ \"acc_norm\": 0.7183673469387755,\n \"acc_norm_stderr\": 0.028795185574291293\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8557213930348259,\n\
\ \"acc_stderr\": 0.024845753212306046,\n \"acc_norm\": 0.8557213930348259,\n\
\ \"acc_norm_stderr\": 0.024845753212306046\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.85,\n \"acc_stderr\": 0.0358870281282637,\n \
\ \"acc_norm\": 0.85,\n \"acc_norm_stderr\": 0.0358870281282637\n },\n\
\ \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.572289156626506,\n\
\ \"acc_stderr\": 0.038515976837185335,\n \"acc_norm\": 0.572289156626506,\n\
\ \"acc_norm_stderr\": 0.038515976837185335\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.8304093567251462,\n \"acc_stderr\": 0.02878210810540171,\n\
\ \"acc_norm\": 0.8304093567251462,\n \"acc_norm_stderr\": 0.02878210810540171\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.565483476132191,\n\
\ \"mc1_stderr\": 0.01735273874925956,\n \"mc2\": 0.700075180667757,\n\
\ \"mc2_stderr\": 0.014884649824873666\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.8263614838200474,\n \"acc_stderr\": 0.010646116480330996\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.6929492039423806,\n \
\ \"acc_stderr\": 0.012705685723131707\n }\n}\n```"
repo_url: https://huggingface.co/Eric111/CatunaLaserPi-DPO
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|arc:challenge|25_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|gsm8k|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hellaswag|10_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T17-19-49.270307.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T17-19-49.270307.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- '**/details_harness|winogrande|5_2024-03-10T17-19-49.270307.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-10T17-19-49.270307.parquet'
- config_name: results
data_files:
- split: 2024_03_10T17_19_49.270307
path:
- results_2024-03-10T17-19-49.270307.parquet
- split: latest
path:
- results_2024-03-10T17-19-49.270307.parquet
---
# Dataset Card for Evaluation run of Eric111/CatunaLaserPi-DPO
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [Eric111/CatunaLaserPi-DPO](https://huggingface.co/Eric111/CatunaLaserPi-DPO) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 1 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" store all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_Eric111__CatunaLaserPi-DPO",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-10T17:19:49.270307](https://huggingface.co/datasets/open-llm-leaderboard/details_Eric111__CatunaLaserPi-DPO/blob/main/results_2024-03-10T17-19-49.270307.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each task's results in the "results" config and in the "latest" split of each eval):
```python
{
"all": {
"acc": 0.655015176013988,
"acc_stderr": 0.032068924852289675,
"acc_norm": 0.6548524546452552,
"acc_norm_stderr": 0.032731365332683136,
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.700075180667757,
"mc2_stderr": 0.014884649824873666
},
"harness|arc:challenge|25": {
"acc": 0.7022184300341296,
"acc_stderr": 0.01336308010724448,
"acc_norm": 0.7295221843003413,
"acc_norm_stderr": 0.012980954547659556
},
"harness|hellaswag|10": {
"acc": 0.7109141605257917,
"acc_stderr": 0.004524113671259706,
"acc_norm": 0.8832901812387971,
"acc_norm_stderr": 0.003204180072942376
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.33,
"acc_stderr": 0.04725815626252605,
"acc_norm": 0.33,
"acc_norm_stderr": 0.04725815626252605
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.6592592592592592,
"acc_stderr": 0.040943762699967926,
"acc_norm": 0.6592592592592592,
"acc_norm_stderr": 0.040943762699967926
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6907894736842105,
"acc_stderr": 0.037610708698674805,
"acc_norm": 0.6907894736842105,
"acc_norm_stderr": 0.037610708698674805
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.63,
"acc_stderr": 0.048523658709391,
"acc_norm": 0.63,
"acc_norm_stderr": 0.048523658709391
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.7056603773584905,
"acc_stderr": 0.02804918631569525,
"acc_norm": 0.7056603773584905,
"acc_norm_stderr": 0.02804918631569525
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7708333333333334,
"acc_stderr": 0.03514697467862388,
"acc_norm": 0.7708333333333334,
"acc_norm_stderr": 0.03514697467862388
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.48,
"acc_stderr": 0.050211673156867795,
"acc_norm": 0.48,
"acc_norm_stderr": 0.050211673156867795
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.54,
"acc_stderr": 0.05009082659620333,
"acc_norm": 0.54,
"acc_norm_stderr": 0.05009082659620333
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.31,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.31,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.653179190751445,
"acc_stderr": 0.036291466701596636,
"acc_norm": 0.653179190751445,
"acc_norm_stderr": 0.036291466701596636
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.4117647058823529,
"acc_stderr": 0.048971049527263666,
"acc_norm": 0.4117647058823529,
"acc_norm_stderr": 0.048971049527263666
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.75,
"acc_stderr": 0.04351941398892446,
"acc_norm": 0.75,
"acc_norm_stderr": 0.04351941398892446
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5659574468085107,
"acc_stderr": 0.03240038086792747,
"acc_norm": 0.5659574468085107,
"acc_norm_stderr": 0.03240038086792747
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.47368421052631576,
"acc_stderr": 0.046970851366478626,
"acc_norm": 0.47368421052631576,
"acc_norm_stderr": 0.046970851366478626
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5586206896551724,
"acc_stderr": 0.04137931034482757,
"acc_norm": 0.5586206896551724,
"acc_norm_stderr": 0.04137931034482757
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41005291005291006,
"acc_stderr": 0.02533120243894443,
"acc_norm": 0.41005291005291006,
"acc_norm_stderr": 0.02533120243894443
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.49206349206349204,
"acc_stderr": 0.044715725362943486,
"acc_norm": 0.49206349206349204,
"acc_norm_stderr": 0.044715725362943486
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.37,
"acc_stderr": 0.04852365870939099,
"acc_norm": 0.37,
"acc_norm_stderr": 0.04852365870939099
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7935483870967742,
"acc_stderr": 0.023025899617188716,
"acc_norm": 0.7935483870967742,
"acc_norm_stderr": 0.023025899617188716
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.5123152709359606,
"acc_stderr": 0.035169204442208966,
"acc_norm": 0.5123152709359606,
"acc_norm_stderr": 0.035169204442208966
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.7,
"acc_stderr": 0.046056618647183814,
"acc_norm": 0.7,
"acc_norm_stderr": 0.046056618647183814
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7696969696969697,
"acc_stderr": 0.032876667586034906,
"acc_norm": 0.7696969696969697,
"acc_norm_stderr": 0.032876667586034906
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.8080808080808081,
"acc_stderr": 0.028057791672989017,
"acc_norm": 0.8080808080808081,
"acc_norm_stderr": 0.028057791672989017
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.9067357512953368,
"acc_stderr": 0.02098685459328973,
"acc_norm": 0.9067357512953368,
"acc_norm_stderr": 0.02098685459328973
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.6666666666666666,
"acc_stderr": 0.023901157979402534,
"acc_norm": 0.6666666666666666,
"acc_norm_stderr": 0.023901157979402534
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.34074074074074073,
"acc_stderr": 0.028897748741131147,
"acc_norm": 0.34074074074074073,
"acc_norm_stderr": 0.028897748741131147
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.680672268907563,
"acc_stderr": 0.0302839955258844,
"acc_norm": 0.680672268907563,
"acc_norm_stderr": 0.0302839955258844
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.36423841059602646,
"acc_stderr": 0.03929111781242741,
"acc_norm": 0.36423841059602646,
"acc_norm_stderr": 0.03929111781242741
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8477064220183487,
"acc_stderr": 0.015405084393157074,
"acc_norm": 0.8477064220183487,
"acc_norm_stderr": 0.015405084393157074
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5138888888888888,
"acc_stderr": 0.034086558679777494,
"acc_norm": 0.5138888888888888,
"acc_norm_stderr": 0.034086558679777494
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.8480392156862745,
"acc_stderr": 0.025195658428931792,
"acc_norm": 0.8480392156862745,
"acc_norm_stderr": 0.025195658428931792
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.8059071729957806,
"acc_stderr": 0.025744902532290916,
"acc_norm": 0.8059071729957806,
"acc_norm_stderr": 0.025744902532290916
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.695067264573991,
"acc_stderr": 0.030898610882477515,
"acc_norm": 0.695067264573991,
"acc_norm_stderr": 0.030898610882477515
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.7786259541984732,
"acc_stderr": 0.03641297081313729,
"acc_norm": 0.7786259541984732,
"acc_norm_stderr": 0.03641297081313729
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.7851239669421488,
"acc_stderr": 0.037494924487096966,
"acc_norm": 0.7851239669421488,
"acc_norm_stderr": 0.037494924487096966
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7962962962962963,
"acc_stderr": 0.03893542518824847,
"acc_norm": 0.7962962962962963,
"acc_norm_stderr": 0.03893542518824847
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7730061349693251,
"acc_stderr": 0.03291099578615769,
"acc_norm": 0.7730061349693251,
"acc_norm_stderr": 0.03291099578615769
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.4732142857142857,
"acc_stderr": 0.047389751192741546,
"acc_norm": 0.4732142857142857,
"acc_norm_stderr": 0.047389751192741546
},
"harness|hendrycksTest-management|5": {
"acc": 0.7669902912621359,
"acc_stderr": 0.04185832598928315,
"acc_norm": 0.7669902912621359,
"acc_norm_stderr": 0.04185832598928315
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8803418803418803,
"acc_stderr": 0.021262719400406957,
"acc_norm": 0.8803418803418803,
"acc_norm_stderr": 0.021262719400406957
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.71,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.71,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.8314176245210728,
"acc_stderr": 0.013387895731543604,
"acc_norm": 0.8314176245210728,
"acc_norm_stderr": 0.013387895731543604
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7427745664739884,
"acc_stderr": 0.023532925431044287,
"acc_norm": 0.7427745664739884,
"acc_norm_stderr": 0.023532925431044287
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.4301675977653631,
"acc_stderr": 0.01655860163604104,
"acc_norm": 0.4301675977653631,
"acc_norm_stderr": 0.01655860163604104
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.7352941176470589,
"acc_stderr": 0.02526169121972948,
"acc_norm": 0.7352941176470589,
"acc_norm_stderr": 0.02526169121972948
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.7041800643086816,
"acc_stderr": 0.025922371788818767,
"acc_norm": 0.7041800643086816,
"acc_norm_stderr": 0.025922371788818767
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.02438366553103545,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.02438366553103545
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.49645390070921985,
"acc_stderr": 0.02982674915328092,
"acc_norm": 0.49645390070921985,
"acc_norm_stderr": 0.02982674915328092
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.47327249022164275,
"acc_stderr": 0.01275197796767601,
"acc_norm": 0.47327249022164275,
"acc_norm_stderr": 0.01275197796767601
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6727941176470589,
"acc_stderr": 0.028501452860396553,
"acc_norm": 0.6727941176470589,
"acc_norm_stderr": 0.028501452860396553
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6781045751633987,
"acc_stderr": 0.018901015322093092,
"acc_norm": 0.6781045751633987,
"acc_norm_stderr": 0.018901015322093092
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6727272727272727,
"acc_stderr": 0.0449429086625209,
"acc_norm": 0.6727272727272727,
"acc_norm_stderr": 0.0449429086625209
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7183673469387755,
"acc_stderr": 0.028795185574291293,
"acc_norm": 0.7183673469387755,
"acc_norm_stderr": 0.028795185574291293
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8557213930348259,
"acc_stderr": 0.024845753212306046,
"acc_norm": 0.8557213930348259,
"acc_norm_stderr": 0.024845753212306046
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.85,
"acc_stderr": 0.0358870281282637,
"acc_norm": 0.85,
"acc_norm_stderr": 0.0358870281282637
},
"harness|hendrycksTest-virology|5": {
"acc": 0.572289156626506,
"acc_stderr": 0.038515976837185335,
"acc_norm": 0.572289156626506,
"acc_norm_stderr": 0.038515976837185335
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.8304093567251462,
"acc_stderr": 0.02878210810540171,
"acc_norm": 0.8304093567251462,
"acc_norm_stderr": 0.02878210810540171
},
"harness|truthfulqa:mc|0": {
"mc1": 0.565483476132191,
"mc1_stderr": 0.01735273874925956,
"mc2": 0.700075180667757,
"mc2_stderr": 0.014884649824873666
},
"harness|winogrande|5": {
"acc": 0.8263614838200474,
"acc_stderr": 0.010646116480330996
},
"harness|gsm8k|5": {
"acc": 0.6929492039423806,
"acc_stderr": 0.012705685723131707
}
}
```
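The aggregated metrics above can also be consumed programmatically. As a small illustrative sketch (the `results` dict below is a hand-copied subset of the JSON above, not loaded from the hub), here is one way to rank the MMLU subtasks by accuracy:

```python
# Rank MMLU subtasks by accuracy, given an aggregated results dict
# whose shape mirrors the JSON above (only a subset is reproduced here).
results = {
    "harness|hendrycksTest-abstract_algebra|5": {"acc": 0.33},
    "harness|hendrycksTest-marketing|5": {"acc": 0.8803418803418803},
    "harness|hendrycksTest-virology|5": {"acc": 0.572289156626506},
}

def rank_tasks(results: dict) -> list:
    """Return (task_name, acc) pairs sorted from best to worst accuracy."""
    pairs = [
        (name.split("|")[1], metrics["acc"])
        for name, metrics in results.items()
        if "acc" in metrics
    ]
    return sorted(pairs, key=lambda p: p[1], reverse=True)

ranked = rank_tasks(results)
# Best task comes first: marketing, then virology, then abstract_algebra.
```

The same pattern applies to any of the per-task entries in the full results file; only the dict contents change.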
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases, and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
mcemilg/news-cat | mcemilg | "2024-03-10T17:39:09Z" | 0 | 1 | [
"task_categories:text-classification",
"language:tr",
"size_categories:1K<n<10K",
"format:json",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | [
"text-classification"
] | "2024-03-10T17:35:07Z" | ---
task_categories:
- text-classification
language:
- tr
---
Homepage: http://www.kemik.yildiz.edu.tr/veri_kumelerimiz.html |
guyhadad01/Talmud-Hebrew-tok | guyhadad01 | "2024-03-10T17:36:11Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T17:36:08Z" | ---
dataset_info:
features:
- name: input_ids
sequence: int32
splits:
- name: train
num_bytes: 8927832
num_examples: 17302
download_size: 5094023
dataset_size: 8927832
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
tuantmdev/task_training_v2 | tuantmdev | "2024-03-10T17:38:09Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T17:37:36Z" | ---
dataset_info:
features:
- name: question
dtype: string
- name: answer
dtype: string
splits:
- name: train
num_bytes: 128968346
num_examples: 50000
- name: test
num_bytes: 1271277
num_examples: 500
download_size: 18392608
dataset_size: 130239623
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
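As a quick sanity check (not part of the original metadata), the `dataset_size` reported above is simply the sum of the per-split byte counts, while `download_size` is smaller because the files are stored as compressed Parquet:

```python
# Per-split uncompressed byte counts, copied from the dataset_info block above.
splits = {"train": 128968346, "test": 1271277}

# dataset_size is the uncompressed total across all splits.
dataset_size = sum(splits.values())
print(dataset_size)  # 130239623, matching the dataset_size field
```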
|
open-llm-leaderboard-old/details_wandb__mistral-7b-zephyr-dpo | open-llm-leaderboard-old | "2024-03-11T21:44:44Z" | 0 | 0 | [
"region:us"
] | null | "2024-03-10T17:42:51Z" | ---
pretty_name: Evaluation run of wandb/mistral-7b-zephyr-dpo
dataset_summary: "Dataset automatically created during the evaluation run of model\
\ [wandb/mistral-7b-zephyr-dpo](https://huggingface.co/wandb/mistral-7b-zephyr-dpo)\
\ on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).\n\
\nThe dataset is composed of 63 configurations, each one corresponding to one of the\
\ evaluated tasks.\n\nThe dataset has been created from 2 run(s). Each run can be\
\ found as a specific split in each configuration, the split being named using the\
\ timestamp of the run.The \"train\" split is always pointing to the latest results.\n\
\nAn additional configuration \"results\" store all the aggregated results of the\
\ run (and is used to compute and display the aggregated metrics on the [Open LLM\
\ Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).\n\
\nTo load the details from a run, you can for instance do the following:\n```python\n\
from datasets import load_dataset\ndata = load_dataset(\"open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo\"\
,\n\t\"harness_winogrande_5\",\n\tsplit=\"train\")\n```\n\n## Latest results\n\n\
These are the [latest results from run 2024-03-11T21:42:03.928518](https://huggingface.co/datasets/open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo/blob/main/results_2024-03-11T21-42-03.928518.json)(note\
\ that there might be results for other tasks in the repos if successive evals didn't\
\ cover the same tasks. You can find each in the results and the \"latest\" split for\
\ each eval):\n\n```python\n{\n \"all\": {\n \"acc\": 0.6197175143025987,\n\
\ \"acc_stderr\": 0.032785226600484156,\n \"acc_norm\": 0.6241561892365968,\n\
\ \"acc_norm_stderr\": 0.03344678060029092,\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5660736416141117,\n\
\ \"mc2_stderr\": 0.015703591472463297\n },\n \"harness|arc:challenge|25\"\
: {\n \"acc\": 0.6126279863481229,\n \"acc_stderr\": 0.01423587248790987,\n\
\ \"acc_norm\": 0.6373720136518771,\n \"acc_norm_stderr\": 0.014049106564955012\n\
\ },\n \"harness|hellaswag|10\": {\n \"acc\": 0.6693885680143398,\n\
\ \"acc_stderr\": 0.004694718918225753,\n \"acc_norm\": 0.8578968333001394,\n\
\ \"acc_norm_stderr\": 0.003484423442092664\n },\n \"harness|hendrycksTest-abstract_algebra|5\"\
: {\n \"acc\": 0.29,\n \"acc_stderr\": 0.045604802157206845,\n \
\ \"acc_norm\": 0.29,\n \"acc_norm_stderr\": 0.045604802157206845\n \
\ },\n \"harness|hendrycksTest-anatomy|5\": {\n \"acc\": 0.5925925925925926,\n\
\ \"acc_stderr\": 0.04244633238353228,\n \"acc_norm\": 0.5925925925925926,\n\
\ \"acc_norm_stderr\": 0.04244633238353228\n },\n \"harness|hendrycksTest-astronomy|5\"\
: {\n \"acc\": 0.6644736842105263,\n \"acc_stderr\": 0.03842498559395269,\n\
\ \"acc_norm\": 0.6644736842105263,\n \"acc_norm_stderr\": 0.03842498559395269\n\
\ },\n \"harness|hendrycksTest-business_ethics|5\": {\n \"acc\": 0.58,\n\
\ \"acc_stderr\": 0.049604496374885836,\n \"acc_norm\": 0.58,\n \
\ \"acc_norm_stderr\": 0.049604496374885836\n },\n \"harness|hendrycksTest-clinical_knowledge|5\"\
: {\n \"acc\": 0.6792452830188679,\n \"acc_stderr\": 0.028727502957880263,\n\
\ \"acc_norm\": 0.6792452830188679,\n \"acc_norm_stderr\": 0.028727502957880263\n\
\ },\n \"harness|hendrycksTest-college_biology|5\": {\n \"acc\": 0.7222222222222222,\n\
\ \"acc_stderr\": 0.037455547914624555,\n \"acc_norm\": 0.7222222222222222,\n\
\ \"acc_norm_stderr\": 0.037455547914624555\n },\n \"harness|hendrycksTest-college_chemistry|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_computer_science|5\"\
: {\n \"acc\": 0.45,\n \"acc_stderr\": 0.05,\n \"acc_norm\"\
: 0.45,\n \"acc_norm_stderr\": 0.05\n },\n \"harness|hendrycksTest-college_mathematics|5\"\
: {\n \"acc\": 0.39,\n \"acc_stderr\": 0.04902071300001975,\n \
\ \"acc_norm\": 0.39,\n \"acc_norm_stderr\": 0.04902071300001975\n \
\ },\n \"harness|hendrycksTest-college_medicine|5\": {\n \"acc\": 0.6127167630057804,\n\
\ \"acc_stderr\": 0.03714325906302065,\n \"acc_norm\": 0.6127167630057804,\n\
\ \"acc_norm_stderr\": 0.03714325906302065\n },\n \"harness|hendrycksTest-college_physics|5\"\
: {\n \"acc\": 0.38235294117647056,\n \"acc_stderr\": 0.04835503696107223,\n\
\ \"acc_norm\": 0.38235294117647056,\n \"acc_norm_stderr\": 0.04835503696107223\n\
\ },\n \"harness|hendrycksTest-computer_security|5\": {\n \"acc\":\
\ 0.79,\n \"acc_stderr\": 0.04093601807403326,\n \"acc_norm\": 0.79,\n\
\ \"acc_norm_stderr\": 0.04093601807403326\n },\n \"harness|hendrycksTest-conceptual_physics|5\"\
: {\n \"acc\": 0.5404255319148936,\n \"acc_stderr\": 0.032579014820998356,\n\
\ \"acc_norm\": 0.5404255319148936,\n \"acc_norm_stderr\": 0.032579014820998356\n\
\ },\n \"harness|hendrycksTest-econometrics|5\": {\n \"acc\": 0.4649122807017544,\n\
\ \"acc_stderr\": 0.04692008381368909,\n \"acc_norm\": 0.4649122807017544,\n\
\ \"acc_norm_stderr\": 0.04692008381368909\n },\n \"harness|hendrycksTest-electrical_engineering|5\"\
: {\n \"acc\": 0.5310344827586206,\n \"acc_stderr\": 0.04158632762097828,\n\
\ \"acc_norm\": 0.5310344827586206,\n \"acc_norm_stderr\": 0.04158632762097828\n\
\ },\n \"harness|hendrycksTest-elementary_mathematics|5\": {\n \"acc\"\
: 0.41798941798941797,\n \"acc_stderr\": 0.025402555503260912,\n \"\
acc_norm\": 0.41798941798941797,\n \"acc_norm_stderr\": 0.025402555503260912\n\
\ },\n \"harness|hendrycksTest-formal_logic|5\": {\n \"acc\": 0.40476190476190477,\n\
\ \"acc_stderr\": 0.04390259265377562,\n \"acc_norm\": 0.40476190476190477,\n\
\ \"acc_norm_stderr\": 0.04390259265377562\n },\n \"harness|hendrycksTest-global_facts|5\"\
: {\n \"acc\": 0.32,\n \"acc_stderr\": 0.046882617226215034,\n \
\ \"acc_norm\": 0.32,\n \"acc_norm_stderr\": 0.046882617226215034\n \
\ },\n \"harness|hendrycksTest-high_school_biology|5\": {\n \"acc\"\
: 0.7451612903225806,\n \"acc_stderr\": 0.024790118459332208,\n \"\
acc_norm\": 0.7451612903225806,\n \"acc_norm_stderr\": 0.024790118459332208\n\
\ },\n \"harness|hendrycksTest-high_school_chemistry|5\": {\n \"acc\"\
: 0.4827586206896552,\n \"acc_stderr\": 0.035158955511657,\n \"acc_norm\"\
: 0.4827586206896552,\n \"acc_norm_stderr\": 0.035158955511657\n },\n\
\ \"harness|hendrycksTest-high_school_computer_science|5\": {\n \"acc\"\
: 0.59,\n \"acc_stderr\": 0.04943110704237101,\n \"acc_norm\": 0.59,\n\
\ \"acc_norm_stderr\": 0.04943110704237101\n },\n \"harness|hendrycksTest-high_school_european_history|5\"\
: {\n \"acc\": 0.7575757575757576,\n \"acc_stderr\": 0.03346409881055953,\n\
\ \"acc_norm\": 0.7575757575757576,\n \"acc_norm_stderr\": 0.03346409881055953\n\
\ },\n \"harness|hendrycksTest-high_school_geography|5\": {\n \"acc\"\
: 0.7626262626262627,\n \"acc_stderr\": 0.030313710538198892,\n \"\
acc_norm\": 0.7626262626262627,\n \"acc_norm_stderr\": 0.030313710538198892\n\
\ },\n \"harness|hendrycksTest-high_school_government_and_politics|5\": {\n\
\ \"acc\": 0.8704663212435233,\n \"acc_stderr\": 0.024233532297758733,\n\
\ \"acc_norm\": 0.8704663212435233,\n \"acc_norm_stderr\": 0.024233532297758733\n\
\ },\n \"harness|hendrycksTest-high_school_macroeconomics|5\": {\n \
\ \"acc\": 0.5897435897435898,\n \"acc_stderr\": 0.024939313906940798,\n\
\ \"acc_norm\": 0.5897435897435898,\n \"acc_norm_stderr\": 0.024939313906940798\n\
\ },\n \"harness|hendrycksTest-high_school_mathematics|5\": {\n \"\
acc\": 0.31851851851851853,\n \"acc_stderr\": 0.02840653309060846,\n \
\ \"acc_norm\": 0.31851851851851853,\n \"acc_norm_stderr\": 0.02840653309060846\n\
\ },\n \"harness|hendrycksTest-high_school_microeconomics|5\": {\n \
\ \"acc\": 0.6050420168067226,\n \"acc_stderr\": 0.03175367846096626,\n \
\ \"acc_norm\": 0.6050420168067226,\n \"acc_norm_stderr\": 0.03175367846096626\n\
\ },\n \"harness|hendrycksTest-high_school_physics|5\": {\n \"acc\"\
: 0.32450331125827814,\n \"acc_stderr\": 0.03822746937658753,\n \"\
acc_norm\": 0.32450331125827814,\n \"acc_norm_stderr\": 0.03822746937658753\n\
\ },\n \"harness|hendrycksTest-high_school_psychology|5\": {\n \"acc\"\
: 0.8110091743119267,\n \"acc_stderr\": 0.01678548115920363,\n \"\
acc_norm\": 0.8110091743119267,\n \"acc_norm_stderr\": 0.01678548115920363\n\
\ },\n \"harness|hendrycksTest-high_school_statistics|5\": {\n \"acc\"\
: 0.5046296296296297,\n \"acc_stderr\": 0.03409825519163572,\n \"\
acc_norm\": 0.5046296296296297,\n \"acc_norm_stderr\": 0.03409825519163572\n\
\ },\n \"harness|hendrycksTest-high_school_us_history|5\": {\n \"acc\"\
: 0.7647058823529411,\n \"acc_stderr\": 0.029771775228145628,\n \"\
acc_norm\": 0.7647058823529411,\n \"acc_norm_stderr\": 0.029771775228145628\n\
\ },\n \"harness|hendrycksTest-high_school_world_history|5\": {\n \"\
acc\": 0.7890295358649789,\n \"acc_stderr\": 0.02655837250266192,\n \
\ \"acc_norm\": 0.7890295358649789,\n \"acc_norm_stderr\": 0.02655837250266192\n\
\ },\n \"harness|hendrycksTest-human_aging|5\": {\n \"acc\": 0.6636771300448431,\n\
\ \"acc_stderr\": 0.031708824268455,\n \"acc_norm\": 0.6636771300448431,\n\
\ \"acc_norm_stderr\": 0.031708824268455\n },\n \"harness|hendrycksTest-human_sexuality|5\"\
: {\n \"acc\": 0.732824427480916,\n \"acc_stderr\": 0.038808483010823944,\n\
\ \"acc_norm\": 0.732824427480916,\n \"acc_norm_stderr\": 0.038808483010823944\n\
\ },\n \"harness|hendrycksTest-international_law|5\": {\n \"acc\":\
\ 0.768595041322314,\n \"acc_stderr\": 0.03849856098794088,\n \"acc_norm\"\
: 0.768595041322314,\n \"acc_norm_stderr\": 0.03849856098794088\n },\n\
\ \"harness|hendrycksTest-jurisprudence|5\": {\n \"acc\": 0.7407407407407407,\n\
\ \"acc_stderr\": 0.04236511258094634,\n \"acc_norm\": 0.7407407407407407,\n\
\ \"acc_norm_stderr\": 0.04236511258094634\n },\n \"harness|hendrycksTest-logical_fallacies|5\"\
: {\n \"acc\": 0.7668711656441718,\n \"acc_stderr\": 0.0332201579577674,\n\
\ \"acc_norm\": 0.7668711656441718,\n \"acc_norm_stderr\": 0.0332201579577674\n\
\ },\n \"harness|hendrycksTest-machine_learning|5\": {\n \"acc\": 0.5178571428571429,\n\
\ \"acc_stderr\": 0.04742762361243011,\n \"acc_norm\": 0.5178571428571429,\n\
\ \"acc_norm_stderr\": 0.04742762361243011\n },\n \"harness|hendrycksTest-management|5\"\
: {\n \"acc\": 0.7961165048543689,\n \"acc_stderr\": 0.03989139859531771,\n\
\ \"acc_norm\": 0.7961165048543689,\n \"acc_norm_stderr\": 0.03989139859531771\n\
\ },\n \"harness|hendrycksTest-marketing|5\": {\n \"acc\": 0.8717948717948718,\n\
\ \"acc_stderr\": 0.02190190511507333,\n \"acc_norm\": 0.8717948717948718,\n\
\ \"acc_norm_stderr\": 0.02190190511507333\n },\n \"harness|hendrycksTest-medical_genetics|5\"\
: {\n \"acc\": 0.69,\n \"acc_stderr\": 0.04648231987117316,\n \
\ \"acc_norm\": 0.69,\n \"acc_norm_stderr\": 0.04648231987117316\n \
\ },\n \"harness|hendrycksTest-miscellaneous|5\": {\n \"acc\": 0.7841634738186463,\n\
\ \"acc_stderr\": 0.014711684386139953,\n \"acc_norm\": 0.7841634738186463,\n\
\ \"acc_norm_stderr\": 0.014711684386139953\n },\n \"harness|hendrycksTest-moral_disputes|5\"\
: {\n \"acc\": 0.7225433526011561,\n \"acc_stderr\": 0.024105712607754307,\n\
\ \"acc_norm\": 0.7225433526011561,\n \"acc_norm_stderr\": 0.024105712607754307\n\
\ },\n \"harness|hendrycksTest-moral_scenarios|5\": {\n \"acc\": 0.3743016759776536,\n\
\ \"acc_stderr\": 0.01618544417945717,\n \"acc_norm\": 0.3743016759776536,\n\
\ \"acc_norm_stderr\": 0.01618544417945717\n },\n \"harness|hendrycksTest-nutrition|5\"\
: {\n \"acc\": 0.696078431372549,\n \"acc_stderr\": 0.026336613469046633,\n\
\ \"acc_norm\": 0.696078431372549,\n \"acc_norm_stderr\": 0.026336613469046633\n\
\ },\n \"harness|hendrycksTest-philosophy|5\": {\n \"acc\": 0.707395498392283,\n\
\ \"acc_stderr\": 0.02583989833487798,\n \"acc_norm\": 0.707395498392283,\n\
\ \"acc_norm_stderr\": 0.02583989833487798\n },\n \"harness|hendrycksTest-prehistory|5\"\
: {\n \"acc\": 0.7129629629629629,\n \"acc_stderr\": 0.02517104191530968,\n\
\ \"acc_norm\": 0.7129629629629629,\n \"acc_norm_stderr\": 0.02517104191530968\n\
\ },\n \"harness|hendrycksTest-professional_accounting|5\": {\n \"\
acc\": 0.4787234042553192,\n \"acc_stderr\": 0.029800481645628693,\n \
\ \"acc_norm\": 0.4787234042553192,\n \"acc_norm_stderr\": 0.029800481645628693\n\
\ },\n \"harness|hendrycksTest-professional_law|5\": {\n \"acc\": 0.4485006518904824,\n\
\ \"acc_stderr\": 0.012702317490559806,\n \"acc_norm\": 0.4485006518904824,\n\
\ \"acc_norm_stderr\": 0.012702317490559806\n },\n \"harness|hendrycksTest-professional_medicine|5\"\
: {\n \"acc\": 0.6433823529411765,\n \"acc_stderr\": 0.029097209568411952,\n\
\ \"acc_norm\": 0.6433823529411765,\n \"acc_norm_stderr\": 0.029097209568411952\n\
\ },\n \"harness|hendrycksTest-professional_psychology|5\": {\n \"\
acc\": 0.6683006535947712,\n \"acc_stderr\": 0.019047485239360378,\n \
\ \"acc_norm\": 0.6683006535947712,\n \"acc_norm_stderr\": 0.019047485239360378\n\
\ },\n \"harness|hendrycksTest-public_relations|5\": {\n \"acc\": 0.6545454545454545,\n\
\ \"acc_stderr\": 0.04554619617541054,\n \"acc_norm\": 0.6545454545454545,\n\
\ \"acc_norm_stderr\": 0.04554619617541054\n },\n \"harness|hendrycksTest-security_studies|5\"\
: {\n \"acc\": 0.7142857142857143,\n \"acc_stderr\": 0.028920583220675606,\n\
\ \"acc_norm\": 0.7142857142857143,\n \"acc_norm_stderr\": 0.028920583220675606\n\
\ },\n \"harness|hendrycksTest-sociology|5\": {\n \"acc\": 0.8059701492537313,\n\
\ \"acc_stderr\": 0.027962677604768917,\n \"acc_norm\": 0.8059701492537313,\n\
\ \"acc_norm_stderr\": 0.027962677604768917\n },\n \"harness|hendrycksTest-us_foreign_policy|5\"\
: {\n \"acc\": 0.84,\n \"acc_stderr\": 0.03684529491774709,\n \
\ \"acc_norm\": 0.84,\n \"acc_norm_stderr\": 0.03684529491774709\n \
\ },\n \"harness|hendrycksTest-virology|5\": {\n \"acc\": 0.5481927710843374,\n\
\ \"acc_stderr\": 0.03874371556587953,\n \"acc_norm\": 0.5481927710843374,\n\
\ \"acc_norm_stderr\": 0.03874371556587953\n },\n \"harness|hendrycksTest-world_religions|5\"\
: {\n \"acc\": 0.783625730994152,\n \"acc_stderr\": 0.031581495393387324,\n\
\ \"acc_norm\": 0.783625730994152,\n \"acc_norm_stderr\": 0.031581495393387324\n\
\ },\n \"harness|truthfulqa:mc|0\": {\n \"mc1\": 0.4149326805385557,\n\
\ \"mc1_stderr\": 0.017248314465805978,\n \"mc2\": 0.5660736416141117,\n\
\ \"mc2_stderr\": 0.015703591472463297\n },\n \"harness|winogrande|5\"\
: {\n \"acc\": 0.7821625887924231,\n \"acc_stderr\": 0.011601066079939324\n\
\ },\n \"harness|gsm8k|5\": {\n \"acc\": 0.4086429112964367,\n \
\ \"acc_stderr\": 0.013540639733342422\n }\n}\n```"
repo_url: https://huggingface.co/wandb/mistral-7b-zephyr-dpo
leaderboard_url: https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard
point_of_contact: clementine@hf.co
configs:
- config_name: harness_arc_challenge_25
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|arc:challenge|25_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|arc:challenge|25_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|arc:challenge|25_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_gsm8k_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|gsm8k|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|gsm8k|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|gsm8k|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hellaswag_10
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hellaswag|10_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hellaswag|10_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hellaswag|10_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T17-40-34.142017.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-management|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T21-42-03.928518.parquet'
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_abstract_algebra_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-abstract_algebra|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_anatomy_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-anatomy|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_astronomy_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-astronomy|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_business_ethics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-business_ethics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_clinical_knowledge_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-clinical_knowledge|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_biology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_biology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_chemistry_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_computer_science_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_mathematics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_medicine_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_medicine|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_college_physics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-college_physics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_computer_security_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-computer_security|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_conceptual_physics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-conceptual_physics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_econometrics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-econometrics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_electrical_engineering_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-electrical_engineering|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_elementary_mathematics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-elementary_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_formal_logic_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-formal_logic|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_global_facts_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-global_facts|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_biology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_biology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_chemistry_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_chemistry|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_computer_science_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_computer_science|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_european_history_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_european_history|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_geography_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_geography|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_government_and_politics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_government_and_politics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_macroeconomics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_macroeconomics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_mathematics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_mathematics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_microeconomics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_microeconomics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_physics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_physics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_psychology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_psychology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_statistics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_statistics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_us_history_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_us_history|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_high_school_world_history_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-high_school_world_history|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_human_aging_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_aging|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_human_sexuality_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-human_sexuality|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_international_law_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-international_law|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_jurisprudence_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-jurisprudence|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_logical_fallacies_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-logical_fallacies|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_machine_learning_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-machine_learning|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_management_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-management|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_marketing_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-marketing|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_medical_genetics_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-medical_genetics|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_miscellaneous_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-miscellaneous|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_moral_disputes_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_disputes|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_moral_scenarios_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-moral_scenarios|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_nutrition_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-nutrition|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_philosophy_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-philosophy|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_prehistory_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-prehistory|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_professional_accounting_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_accounting|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_professional_law_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_law|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_professional_medicine_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_medicine|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_professional_psychology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-professional_psychology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_public_relations_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-public_relations|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_security_studies_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-security_studies|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_sociology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-sociology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_us_foreign_policy_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-us_foreign_policy|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_virology_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-virology|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_hendrycksTest_world_religions_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|hendrycksTest-world_religions|5_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_truthfulqa_mc_0
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|truthfulqa:mc|0_2024-03-11T21-42-03.928518.parquet'
- config_name: harness_winogrande_5
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- '**/details_harness|winogrande|5_2024-03-10T17-40-34.142017.parquet'
- split: 2024_03_11T21_42_03.928518
path:
- '**/details_harness|winogrande|5_2024-03-11T21-42-03.928518.parquet'
- split: latest
path:
- '**/details_harness|winogrande|5_2024-03-11T21-42-03.928518.parquet'
- config_name: results
data_files:
- split: 2024_03_10T17_40_34.142017
path:
- results_2024-03-10T17-40-34.142017.parquet
- split: 2024_03_11T21_42_03.928518
path:
- results_2024-03-11T21-42-03.928518.parquet
- split: latest
path:
- results_2024-03-11T21-42-03.928518.parquet
---
# Dataset Card for Evaluation run of wandb/mistral-7b-zephyr-dpo
<!-- Provide a quick summary of the dataset. -->
Dataset automatically created during the evaluation run of model [wandb/mistral-7b-zephyr-dpo](https://huggingface.co/wandb/mistral-7b-zephyr-dpo) on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard).
The dataset is composed of 63 configurations, each one corresponding to one of the evaluated tasks.
The dataset has been created from 2 run(s). Each run can be found as a specific split in each configuration, the split being named using the timestamp of the run. The "train" split always points to the latest results.
An additional configuration "results" stores all the aggregated results of the run (and is used to compute and display the aggregated metrics on the [Open LLM Leaderboard](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)).
To load the details from a run, you can for instance do the following:
```python
from datasets import load_dataset
data = load_dataset("open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo",
"harness_winogrande_5",
split="train")
```
## Latest results
These are the [latest results from run 2024-03-11T21:42:03.928518](https://huggingface.co/datasets/open-llm-leaderboard/details_wandb__mistral-7b-zephyr-dpo/blob/main/results_2024-03-11T21-42-03.928518.json) (note that there might be results for other tasks in the repos if successive evals didn't cover the same tasks; you can find each in the "results" configuration and in the "latest" split for each eval):
```json
{
"all": {
"acc": 0.6197175143025987,
"acc_stderr": 0.032785226600484156,
"acc_norm": 0.6241561892365968,
"acc_norm_stderr": 0.03344678060029092,
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5660736416141117,
"mc2_stderr": 0.015703591472463297
},
"harness|arc:challenge|25": {
"acc": 0.6126279863481229,
"acc_stderr": 0.01423587248790987,
"acc_norm": 0.6373720136518771,
"acc_norm_stderr": 0.014049106564955012
},
"harness|hellaswag|10": {
"acc": 0.6693885680143398,
"acc_stderr": 0.004694718918225753,
"acc_norm": 0.8578968333001394,
"acc_norm_stderr": 0.003484423442092664
},
"harness|hendrycksTest-abstract_algebra|5": {
"acc": 0.29,
"acc_stderr": 0.045604802157206845,
"acc_norm": 0.29,
"acc_norm_stderr": 0.045604802157206845
},
"harness|hendrycksTest-anatomy|5": {
"acc": 0.5925925925925926,
"acc_stderr": 0.04244633238353228,
"acc_norm": 0.5925925925925926,
"acc_norm_stderr": 0.04244633238353228
},
"harness|hendrycksTest-astronomy|5": {
"acc": 0.6644736842105263,
"acc_stderr": 0.03842498559395269,
"acc_norm": 0.6644736842105263,
"acc_norm_stderr": 0.03842498559395269
},
"harness|hendrycksTest-business_ethics|5": {
"acc": 0.58,
"acc_stderr": 0.049604496374885836,
"acc_norm": 0.58,
"acc_norm_stderr": 0.049604496374885836
},
"harness|hendrycksTest-clinical_knowledge|5": {
"acc": 0.6792452830188679,
"acc_stderr": 0.028727502957880263,
"acc_norm": 0.6792452830188679,
"acc_norm_stderr": 0.028727502957880263
},
"harness|hendrycksTest-college_biology|5": {
"acc": 0.7222222222222222,
"acc_stderr": 0.037455547914624555,
"acc_norm": 0.7222222222222222,
"acc_norm_stderr": 0.037455547914624555
},
"harness|hendrycksTest-college_chemistry|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_computer_science|5": {
"acc": 0.45,
"acc_stderr": 0.05,
"acc_norm": 0.45,
"acc_norm_stderr": 0.05
},
"harness|hendrycksTest-college_mathematics|5": {
"acc": 0.39,
"acc_stderr": 0.04902071300001975,
"acc_norm": 0.39,
"acc_norm_stderr": 0.04902071300001975
},
"harness|hendrycksTest-college_medicine|5": {
"acc": 0.6127167630057804,
"acc_stderr": 0.03714325906302065,
"acc_norm": 0.6127167630057804,
"acc_norm_stderr": 0.03714325906302065
},
"harness|hendrycksTest-college_physics|5": {
"acc": 0.38235294117647056,
"acc_stderr": 0.04835503696107223,
"acc_norm": 0.38235294117647056,
"acc_norm_stderr": 0.04835503696107223
},
"harness|hendrycksTest-computer_security|5": {
"acc": 0.79,
"acc_stderr": 0.04093601807403326,
"acc_norm": 0.79,
"acc_norm_stderr": 0.04093601807403326
},
"harness|hendrycksTest-conceptual_physics|5": {
"acc": 0.5404255319148936,
"acc_stderr": 0.032579014820998356,
"acc_norm": 0.5404255319148936,
"acc_norm_stderr": 0.032579014820998356
},
"harness|hendrycksTest-econometrics|5": {
"acc": 0.4649122807017544,
"acc_stderr": 0.04692008381368909,
"acc_norm": 0.4649122807017544,
"acc_norm_stderr": 0.04692008381368909
},
"harness|hendrycksTest-electrical_engineering|5": {
"acc": 0.5310344827586206,
"acc_stderr": 0.04158632762097828,
"acc_norm": 0.5310344827586206,
"acc_norm_stderr": 0.04158632762097828
},
"harness|hendrycksTest-elementary_mathematics|5": {
"acc": 0.41798941798941797,
"acc_stderr": 0.025402555503260912,
"acc_norm": 0.41798941798941797,
"acc_norm_stderr": 0.025402555503260912
},
"harness|hendrycksTest-formal_logic|5": {
"acc": 0.40476190476190477,
"acc_stderr": 0.04390259265377562,
"acc_norm": 0.40476190476190477,
"acc_norm_stderr": 0.04390259265377562
},
"harness|hendrycksTest-global_facts|5": {
"acc": 0.32,
"acc_stderr": 0.046882617226215034,
"acc_norm": 0.32,
"acc_norm_stderr": 0.046882617226215034
},
"harness|hendrycksTest-high_school_biology|5": {
"acc": 0.7451612903225806,
"acc_stderr": 0.024790118459332208,
"acc_norm": 0.7451612903225806,
"acc_norm_stderr": 0.024790118459332208
},
"harness|hendrycksTest-high_school_chemistry|5": {
"acc": 0.4827586206896552,
"acc_stderr": 0.035158955511657,
"acc_norm": 0.4827586206896552,
"acc_norm_stderr": 0.035158955511657
},
"harness|hendrycksTest-high_school_computer_science|5": {
"acc": 0.59,
"acc_stderr": 0.04943110704237101,
"acc_norm": 0.59,
"acc_norm_stderr": 0.04943110704237101
},
"harness|hendrycksTest-high_school_european_history|5": {
"acc": 0.7575757575757576,
"acc_stderr": 0.03346409881055953,
"acc_norm": 0.7575757575757576,
"acc_norm_stderr": 0.03346409881055953
},
"harness|hendrycksTest-high_school_geography|5": {
"acc": 0.7626262626262627,
"acc_stderr": 0.030313710538198892,
"acc_norm": 0.7626262626262627,
"acc_norm_stderr": 0.030313710538198892
},
"harness|hendrycksTest-high_school_government_and_politics|5": {
"acc": 0.8704663212435233,
"acc_stderr": 0.024233532297758733,
"acc_norm": 0.8704663212435233,
"acc_norm_stderr": 0.024233532297758733
},
"harness|hendrycksTest-high_school_macroeconomics|5": {
"acc": 0.5897435897435898,
"acc_stderr": 0.024939313906940798,
"acc_norm": 0.5897435897435898,
"acc_norm_stderr": 0.024939313906940798
},
"harness|hendrycksTest-high_school_mathematics|5": {
"acc": 0.31851851851851853,
"acc_stderr": 0.02840653309060846,
"acc_norm": 0.31851851851851853,
"acc_norm_stderr": 0.02840653309060846
},
"harness|hendrycksTest-high_school_microeconomics|5": {
"acc": 0.6050420168067226,
"acc_stderr": 0.03175367846096626,
"acc_norm": 0.6050420168067226,
"acc_norm_stderr": 0.03175367846096626
},
"harness|hendrycksTest-high_school_physics|5": {
"acc": 0.32450331125827814,
"acc_stderr": 0.03822746937658753,
"acc_norm": 0.32450331125827814,
"acc_norm_stderr": 0.03822746937658753
},
"harness|hendrycksTest-high_school_psychology|5": {
"acc": 0.8110091743119267,
"acc_stderr": 0.01678548115920363,
"acc_norm": 0.8110091743119267,
"acc_norm_stderr": 0.01678548115920363
},
"harness|hendrycksTest-high_school_statistics|5": {
"acc": 0.5046296296296297,
"acc_stderr": 0.03409825519163572,
"acc_norm": 0.5046296296296297,
"acc_norm_stderr": 0.03409825519163572
},
"harness|hendrycksTest-high_school_us_history|5": {
"acc": 0.7647058823529411,
"acc_stderr": 0.029771775228145628,
"acc_norm": 0.7647058823529411,
"acc_norm_stderr": 0.029771775228145628
},
"harness|hendrycksTest-high_school_world_history|5": {
"acc": 0.7890295358649789,
"acc_stderr": 0.02655837250266192,
"acc_norm": 0.7890295358649789,
"acc_norm_stderr": 0.02655837250266192
},
"harness|hendrycksTest-human_aging|5": {
"acc": 0.6636771300448431,
"acc_stderr": 0.031708824268455,
"acc_norm": 0.6636771300448431,
"acc_norm_stderr": 0.031708824268455
},
"harness|hendrycksTest-human_sexuality|5": {
"acc": 0.732824427480916,
"acc_stderr": 0.038808483010823944,
"acc_norm": 0.732824427480916,
"acc_norm_stderr": 0.038808483010823944
},
"harness|hendrycksTest-international_law|5": {
"acc": 0.768595041322314,
"acc_stderr": 0.03849856098794088,
"acc_norm": 0.768595041322314,
"acc_norm_stderr": 0.03849856098794088
},
"harness|hendrycksTest-jurisprudence|5": {
"acc": 0.7407407407407407,
"acc_stderr": 0.04236511258094634,
"acc_norm": 0.7407407407407407,
"acc_norm_stderr": 0.04236511258094634
},
"harness|hendrycksTest-logical_fallacies|5": {
"acc": 0.7668711656441718,
"acc_stderr": 0.0332201579577674,
"acc_norm": 0.7668711656441718,
"acc_norm_stderr": 0.0332201579577674
},
"harness|hendrycksTest-machine_learning|5": {
"acc": 0.5178571428571429,
"acc_stderr": 0.04742762361243011,
"acc_norm": 0.5178571428571429,
"acc_norm_stderr": 0.04742762361243011
},
"harness|hendrycksTest-management|5": {
"acc": 0.7961165048543689,
"acc_stderr": 0.03989139859531771,
"acc_norm": 0.7961165048543689,
"acc_norm_stderr": 0.03989139859531771
},
"harness|hendrycksTest-marketing|5": {
"acc": 0.8717948717948718,
"acc_stderr": 0.02190190511507333,
"acc_norm": 0.8717948717948718,
"acc_norm_stderr": 0.02190190511507333
},
"harness|hendrycksTest-medical_genetics|5": {
"acc": 0.69,
"acc_stderr": 0.04648231987117316,
"acc_norm": 0.69,
"acc_norm_stderr": 0.04648231987117316
},
"harness|hendrycksTest-miscellaneous|5": {
"acc": 0.7841634738186463,
"acc_stderr": 0.014711684386139953,
"acc_norm": 0.7841634738186463,
"acc_norm_stderr": 0.014711684386139953
},
"harness|hendrycksTest-moral_disputes|5": {
"acc": 0.7225433526011561,
"acc_stderr": 0.024105712607754307,
"acc_norm": 0.7225433526011561,
"acc_norm_stderr": 0.024105712607754307
},
"harness|hendrycksTest-moral_scenarios|5": {
"acc": 0.3743016759776536,
"acc_stderr": 0.01618544417945717,
"acc_norm": 0.3743016759776536,
"acc_norm_stderr": 0.01618544417945717
},
"harness|hendrycksTest-nutrition|5": {
"acc": 0.696078431372549,
"acc_stderr": 0.026336613469046633,
"acc_norm": 0.696078431372549,
"acc_norm_stderr": 0.026336613469046633
},
"harness|hendrycksTest-philosophy|5": {
"acc": 0.707395498392283,
"acc_stderr": 0.02583989833487798,
"acc_norm": 0.707395498392283,
"acc_norm_stderr": 0.02583989833487798
},
"harness|hendrycksTest-prehistory|5": {
"acc": 0.7129629629629629,
"acc_stderr": 0.02517104191530968,
"acc_norm": 0.7129629629629629,
"acc_norm_stderr": 0.02517104191530968
},
"harness|hendrycksTest-professional_accounting|5": {
"acc": 0.4787234042553192,
"acc_stderr": 0.029800481645628693,
"acc_norm": 0.4787234042553192,
"acc_norm_stderr": 0.029800481645628693
},
"harness|hendrycksTest-professional_law|5": {
"acc": 0.4485006518904824,
"acc_stderr": 0.012702317490559806,
"acc_norm": 0.4485006518904824,
"acc_norm_stderr": 0.012702317490559806
},
"harness|hendrycksTest-professional_medicine|5": {
"acc": 0.6433823529411765,
"acc_stderr": 0.029097209568411952,
"acc_norm": 0.6433823529411765,
"acc_norm_stderr": 0.029097209568411952
},
"harness|hendrycksTest-professional_psychology|5": {
"acc": 0.6683006535947712,
"acc_stderr": 0.019047485239360378,
"acc_norm": 0.6683006535947712,
"acc_norm_stderr": 0.019047485239360378
},
"harness|hendrycksTest-public_relations|5": {
"acc": 0.6545454545454545,
"acc_stderr": 0.04554619617541054,
"acc_norm": 0.6545454545454545,
"acc_norm_stderr": 0.04554619617541054
},
"harness|hendrycksTest-security_studies|5": {
"acc": 0.7142857142857143,
"acc_stderr": 0.028920583220675606,
"acc_norm": 0.7142857142857143,
"acc_norm_stderr": 0.028920583220675606
},
"harness|hendrycksTest-sociology|5": {
"acc": 0.8059701492537313,
"acc_stderr": 0.027962677604768917,
"acc_norm": 0.8059701492537313,
"acc_norm_stderr": 0.027962677604768917
},
"harness|hendrycksTest-us_foreign_policy|5": {
"acc": 0.84,
"acc_stderr": 0.03684529491774709,
"acc_norm": 0.84,
"acc_norm_stderr": 0.03684529491774709
},
"harness|hendrycksTest-virology|5": {
"acc": 0.5481927710843374,
"acc_stderr": 0.03874371556587953,
"acc_norm": 0.5481927710843374,
"acc_norm_stderr": 0.03874371556587953
},
"harness|hendrycksTest-world_religions|5": {
"acc": 0.783625730994152,
"acc_stderr": 0.031581495393387324,
"acc_norm": 0.783625730994152,
"acc_norm_stderr": 0.031581495393387324
},
"harness|truthfulqa:mc|0": {
"mc1": 0.4149326805385557,
"mc1_stderr": 0.017248314465805978,
"mc2": 0.5660736416141117,
"mc2_stderr": 0.015703591472463297
},
"harness|winogrande|5": {
"acc": 0.7821625887924231,
"acc_stderr": 0.011601066079939324
},
"harness|gsm8k|5": {
"acc": 0.4086429112964367,
"acc_stderr": 0.013540639733342422
}
}
```
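The per-task `harness|…` keys above can be post-processed programmatically. A minimal sketch (pure Python, using a small excerpt of the structure shown above with illustrative values) that averages `acc` over the MMLU (`hendrycksTest`) subtasks:

```python
# Excerpt of the results dict shown above (values copied from the card).
results = {
    "harness|arc:challenge|25": {"acc": 0.6126279863481229},
    "harness|hendrycksTest-anatomy|5": {"acc": 0.5925925925925926},
    "harness|hendrycksTest-astronomy|5": {"acc": 0.6644736842105263},
}

# Average accuracy over the MMLU ("hendrycksTest") subtasks only.
mmlu_accs = [
    v["acc"] for k, v in results.items() if k.startswith("harness|hendrycksTest")
]
mmlu_mean = sum(mmlu_accs) / len(mmlu_accs)
print(round(mmlu_mean, 4))  # → 0.6285
```

The same key-prefix filter works on the full results file once it has been loaded as a dict (e.g. with `json.load`).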
## Dataset Details
### Dataset Description
<!-- Provide a longer summary of what this dataset is. -->
- **Curated by:** [More Information Needed]
- **Funded by [optional]:** [More Information Needed]
- **Shared by [optional]:** [More Information Needed]
- **Language(s) (NLP):** [More Information Needed]
- **License:** [More Information Needed]
### Dataset Sources [optional]
<!-- Provide the basic links for the dataset. -->
- **Repository:** [More Information Needed]
- **Paper [optional]:** [More Information Needed]
- **Demo [optional]:** [More Information Needed]
## Uses
<!-- Address questions around how the dataset is intended to be used. -->
### Direct Use
<!-- This section describes suitable use cases for the dataset. -->
[More Information Needed]
### Out-of-Scope Use
<!-- This section addresses misuse, malicious use, and uses that the dataset will not work well for. -->
[More Information Needed]
## Dataset Structure
<!-- This section provides a description of the dataset fields, and additional information about the dataset structure such as criteria used to create the splits, relationships between data points, etc. -->
[More Information Needed]
## Dataset Creation
### Curation Rationale
<!-- Motivation for the creation of this dataset. -->
[More Information Needed]
### Source Data
<!-- This section describes the source data (e.g. news text and headlines, social media posts, translated sentences, ...). -->
#### Data Collection and Processing
<!-- This section describes the data collection and processing process such as data selection criteria, filtering and normalization methods, tools and libraries used, etc. -->
[More Information Needed]
#### Who are the source data producers?
<!-- This section describes the people or systems who originally created the data. It should also include self-reported demographic or identity information for the source data creators if this information is available. -->
[More Information Needed]
### Annotations [optional]
<!-- If the dataset contains annotations which are not part of the initial data collection, use this section to describe them. -->
#### Annotation process
<!-- This section describes the annotation process such as annotation tools used in the process, the amount of data annotated, annotation guidelines provided to the annotators, interannotator statistics, annotation validation, etc. -->
[More Information Needed]
#### Who are the annotators?
<!-- This section describes the people or systems who created the annotations. -->
[More Information Needed]
#### Personal and Sensitive Information
<!-- State whether the dataset contains data that might be considered personal, sensitive, or private (e.g., data that reveals addresses, uniquely identifiable names or aliases, racial or ethnic origins, sexual orientations, religious beliefs, political opinions, financial or health data, etc.). If efforts were made to anonymize the data, describe the anonymization process. -->
[More Information Needed]
## Bias, Risks, and Limitations
<!-- This section is meant to convey both technical and sociotechnical limitations. -->
[More Information Needed]
### Recommendations
<!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
Users should be made aware of the risks, biases and limitations of the dataset. More information needed for further recommendations.
## Citation [optional]
<!-- If there is a paper or blog post introducing the dataset, the APA and Bibtex information for that should go in this section. -->
**BibTeX:**
[More Information Needed]
**APA:**
[More Information Needed]
## Glossary [optional]
<!-- If relevant, include terms and calculations in this section that can help readers understand the dataset or dataset card. -->
[More Information Needed]
## More Information [optional]
[More Information Needed]
## Dataset Card Authors [optional]
[More Information Needed]
## Dataset Card Contact
[More Information Needed] |
ravithejads/alpaca-cleaned | ravithejads | "2024-03-10T17:48:38Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T17:45:04Z" | ---
dataset_info:
features:
- name: instruction
dtype: string
- name: input
dtype: string
- name: output
dtype: string
splits:
- name: train
num_bytes: 21732355.77190881
num_examples: 27713
download_size: 16687287
dataset_size: 21732355.77190881
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|
SarcasmNet/self-annotated_reddit_climate_comment | SarcasmNet | "2024-03-10T18:34:41Z" | 0 | 1 | [
"language:en",
"license:mit",
"region:us",
"climate",
"environment",
"reddit",
"comment",
"sarcasm",
"self-annotated"
] | null | "2024-03-10T17:55:18Z" | ---
license: mit
language:
- en
tags:
- climate
- environment
- reddit
- comment
- sarcasm
- self-annotated
pretty_name: Self-AnnotatedRedditClimateComment
splits:
- name: train
dataset_size: 580kb
---
# Dataset Card for Self-annotated Reddit Climate Comment
## Dataset Structure
This JSON snippet shows a representative portion of the dataset. This nested structure allows for efficient navigation and analysis of posts, comments, and replies within specific subreddit communities and individual posts.
```json
{
"id": "1006cei",
"post_title": "Amazing Water Filter Invention",
"post_author": "User123",
"post_body": "Check out this incredible water filter!",
"post_url": "https://example.com/water_filter",
"post_pic": "https://example.com/images/water_filter.jpg",
"subreddit": "inventions",
"post_timestamp": "2023-01-01T12:00:00Z",
"post_upvotes": 123,
"post_permalink": "/r/inventions/comments/1006cei/amazing_water_filter_invention/",
"comments": {
"CommentID": ["abc123", "def456"],
"CommentAuthor": ["User456", "User789"],
"CommentBody": ["This is awesome!", "How does it work?"],
"CommentTimestamp": ["2023-01-01T12:30:00Z", "2023-01-01T13:00:00Z"],
"CommentUpvotes": [5, 7],
"CommentPermalink": ["/r/inventions/comments/1006cei/amazing_water_filter_invention/abc123/", "/r/inventions/comments/1006cei/amazing_water_filter_invention/def456/"],
"Label": [1,0]
  }
}
```
The dataset includes the following fields:
```json
id: string - Unique identifier for the post.
post_title: string - Title of the post.
post_author: string - Username of the author who posted.
post_body: string - Body/content of the post.
post_url: string - URL of the post.
post_pic: Image - Image associated with the post.
subreddit: string - Subreddit where the post was made.
post_timestamp: string - Timestamp of when the post was made.
post_upvotes: int32 - Number of upvotes the post received.
post_permalink: string - Permanent link to the post.
comments: Sequence - Sequence of comments associated with the post.
- CommentID: string - Unique identifier for the comment.
- CommentAuthor: string - Username of the comment author.
- CommentBody: string - Content/body of the comment.
- CommentTimestamp: string - Timestamp of when the comment was made.
- CommentUpvotes: int32 - Number of upvotes the comment received.
- CommentPermalink: string - Permanent link to the comment.
   - Label: int32 - Label marking the comment as sarcastic (1) or neutral (0).
```
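Because each post stores its comments as parallel lists, a common first step is to zip those lists back into one record per comment. A minimal sketch (pure Python, using a toy post shaped like the example above):

```python
# Toy post shaped like the JSON example above, truncated to the relevant fields.
post = {
    "id": "1006cei",
    "comments": {
        "CommentID": ["abc123", "def456"],
        "CommentBody": ["This is awesome!", "How does it work?"],
        "Label": [1, 0],
    },
}

# Zip the parallel lists into one record per comment.
records = [
    {"post_id": post["id"], "comment_id": cid, "body": body, "label": label}
    for cid, body, label in zip(
        post["comments"]["CommentID"],
        post["comments"]["CommentBody"],
        post["comments"]["Label"],
    )
]

# Keep only the comments annotated as sarcastic (label == 1).
sarcastic = [r for r in records if r["label"] == 1]
print(sarcastic)
```

The same pattern extends to the remaining comment fields (`CommentAuthor`, `CommentTimestamp`, `CommentUpvotes`, `CommentPermalink`) by adding them to the `zip`.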
### Recommendations
Users should be made aware of the risks, biases, and limitations of the dataset. More information is needed for further recommendations.
## Citation
**BibTeX:**
```bibtex
@InProceedings{huggingface:dataset,
title = {Self-annotated Reddit Climate Comment},
author={Catherine Wang, Ziyuan Ma},
year={2024}
}
```
|
Balassar/balassarprofile | Balassar | "2024-03-10T19:44:38Z" | 0 | 0 | [
"size_categories:n<1K",
"format:parquet",
"modality:text",
"library:datasets",
"library:pandas",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T18:01:23Z" | ---
dataset_info:
features:
- name: data_input
dtype: string
splits:
- name: train
num_bytes: 5337.6
num_examples: 16
- name: test
num_bytes: 1334.4
num_examples: 4
download_size: 9527
dataset_size: 6672.0
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
- split: test
path: data/test-*
---
|
aryamannningombam/indian-female-combined-tts-final | aryamannningombam | "2024-03-10T18:16:36Z" | 0 | 0 | [
"size_categories:10K<n<100K",
"format:parquet",
"library:datasets",
"library:dask",
"library:mlcroissant",
"library:polars",
"region:us"
] | null | "2024-03-10T18:14:06Z" | ---
dataset_info:
features:
- name: input_ids
sequence: int32
- name: labels
sequence:
sequence: float32
- name: speaker_embeddings
sequence: float32
splits:
- name: train
num_bytes: 4024730180
num_examples: 49836
download_size: 4034304679
dataset_size: 4024730180
configs:
- config_name: default
data_files:
- split: train
path: data/train-*
---
|